Sample records for minimization fundamental principles

  1. Responsible gambling: general principles and minimal requirements.

    PubMed

    Blaszczynski, Alex; Collins, Peter; Fong, Davis; Ladouceur, Robert; Nower, Lia; Shaffer, Howard J; Tavares, Hermano; Venisse, Jean-Luc

    2011-12-01

    Many international jurisdictions have introduced responsible gambling programs. These programs intend to minimize negative consequences of excessive gambling, but vary considerably in their aims, focus, and content. Many responsible gambling programs lack a conceptual framework and, in the absence of empirical data, their components are based only on general considerations and impressions. This paper outlines the consensus viewpoint of an international group of researchers suggesting fundamental responsible gambling principles, roles of key stakeholders, and minimal requirements that stakeholders can use to frame and inform responsible gambling programs across jurisdictions. Such a framework does not purport to offer value statements regarding the legal status of gambling or its expansion. Rather, it proposes gambling-related initiatives aimed at government, industry, and individuals to promote responsible gambling and consumer protection. This paper argues that there is a set of basic principles and minimal requirements that should form the basis for every responsible gambling program.

  2. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    PubMed

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are either based on a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear if one principle allows lower uncertainties to be achieved or if both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical showing, e.g., a direct proportionality for the absolute value of the velocity to the power of 3/2 and an indirect proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality either with Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
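
    Stated compactly, the scaling reported above can be written as follows (a notational sketch; the symbols sigma_v for the velocity uncertainty and P_s for the scattered light power are ours, not the paper's):

      \sigma_v \;\propto\; \frac{|v|^{3/2}}{\sqrt{P_s}}

    In other words, halving the scattered light power raises the fundamental uncertainty limit by a factor of sqrt(2), and this holds equally for the Doppler and time-of-flight principles.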

  3. 47 CFR 36.2 - Fundamental principles underlying procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Fundamental principles underlying procedures... Fundamental principles underlying procedures. (a) The following general principles underlie the procedures... operating forces on a unit basis (e.g., conversation-minute-kilometers per message, weighted standard work...

  4. Principle of minimal work fluctuations.

    PubMed

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^(-βW)⟩ = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain where the classical work protocol is realizable by an adiabatic process, then the classical adiabatic process also yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)].
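
    A minimal numerical sketch (not from the paper) of why the fluctuations of e^(-βW) govern how quickly the Jarzynski average converges; a Gaussian work distribution is assumed, for which ΔF = μ_W - βσ_W²/2 holds exactly:

      import numpy as np

      # Sketch: convergence of the Jarzynski estimator <exp(-beta*W)> = exp(-beta*dF)
      # for an assumed Gaussian work distribution, where dF = mu_W - beta*sigma_W**2/2.
      rng = np.random.default_rng(0)
      beta = 1.0
      mu_W, sigma_W = 2.0, 1.5                 # assumed mean/std of the work values
      dF = mu_W - beta * sigma_W**2 / 2        # exact result for this toy model

      W = rng.normal(mu_W, sigma_W, size=100_000)
      running_mean = np.cumsum(np.exp(-beta * W)) / np.arange(1, W.size + 1)
      dF_estimate = -np.log(running_mean) / beta

      for n in (100, 1_000, 10_000, 100_000):
          print(f"n={n:>7d}  estimate={dF_estimate[n-1]: .4f}  exact={dF:.4f}")
      # Increasing sigma_W (i.e., larger fluctuations of exp(-beta*W)) slows this
      # convergence, which is the motivation for protocols minimizing those fluctuations.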

  5. Clinical Pharmacokinetics in Kidney Disease: Fundamental Principles.

    PubMed

    Lea-Henry, Tom N; Carland, Jane E; Stocker, Sophie L; Sevastos, Jacob; Roberts, Darren M

    2018-06-22

    Kidney disease is an increasingly common comorbidity that alters the pharmacokinetics of many drugs. Prescribing to patients with kidney disease requires knowledge about the drug, the extent of the patient's altered physiology, and pharmacokinetic principles that influence the design of dosing regimens. There are multiple physiologic effects of impaired kidney function, and the extent to which they occur in an individual at any given time can be difficult to define. Although some guidelines are available for dosing in kidney disease, they may be on the basis of limited data or not widely applicable, and therefore, an understanding of pharmacokinetic principles and how to apply them is important to the practicing clinician. Whether kidney disease is acute or chronic, drug clearance decreases, and the volume of distribution may remain the same or increase. Although in CKD, these changes progress relatively slowly, they are dynamic in AKI, and recovery is possible depending on the etiology and treatments. This, and the use of kidney replacement therapies further complicate attempts to quantify drug clearance at the time of prescribing and dosing in AKI. The required change in the dosing regimen can be estimated or even quantitated in certain instances through the application of pharmacokinetic principles to guide rational drug dosing. This offers an opportunity to provide personalized medical care and minimizes adverse drug events from either under- or overdosing. We discuss the principles of pharmacokinetics that are fundamental for the design of an appropriate dosing regimen in this review. Copyright © 2018 by the American Society of Nephrology.
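
    As a concrete illustration of how such principles translate into a dose adjustment (a standard pharmacokinetic heuristic, the Giusti-Hayton factor, not necessarily the method of this review; all numbers below are hypothetical):

      def renal_dose_adjustment(fe: float, kidney_function_ratio: float) -> float:
          """Giusti-Hayton-style adjustment factor Q.

          fe: fraction of the dose excreted unchanged by the kidney (0..1)
          kidney_function_ratio: patient clearance / normal clearance (e.g. CrCl ratio)
          """
          return 1.0 - fe * (1.0 - kidney_function_ratio)

      # Hypothetical drug with fe = 0.7 given to a patient at 30% of normal kidney function.
      q = renal_dose_adjustment(fe=0.7, kidney_function_ratio=0.3)
      usual_daily_dose_mg = 1000.0                      # assumed usual dose
      print(f"Q = {q:.2f}; adjusted dose ~ {usual_daily_dose_mg * q:.0f} mg/day")
      # Equivalently, the dosing interval can be lengthened by a factor of 1/Q.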

  6. The "Fundamental Pedogagical Principle" in Second Language Teaching.

    ERIC Educational Resources Information Center

    Krashen, Stephen D.

    1981-01-01

    A fundamental principle of second language acquisition is stated and applied to language teaching. The principle states that learners acquire a second language when they receive comprehensible input in situations where their affective filters are sufficiently low. The theoretical background of this principle consists of five hypotheses: the…

  7. Fundamental Principles of Stem Cell Banking.

    PubMed

    Sun, Changbin; Yue, Jianhui; He, Na; Liu, Yaqiong; Zhang, Xi; Zhang, Yong

    2016-01-01

    Stem cells are highly promising resources for application in cell therapy, regenerative medicine, drug discovery, toxicology and developmental biology research. Stem cell banks have been increasingly established all over the world in order to preserve their cellular characteristics, prevent contamination and deterioration, and facilitate their effective use in basic and translational research, as well as current and future clinical application. Standardization and quality control during banking procedures are essential to allow researchers from different labs to compare their results and to develop safe and effective new therapies. Furthermore, many stem cells come from once-in-a-life time tissues. Cord blood for example, thrown away in the past, can be used to treat many diseases such as blood cancers nowadays. Meanwhile, these cells stored and often banked for long periods can be immediately available for treatment when needed and early treatment can minimize disease progression. This paper provides an overview of the fundamental principles of stem cell banking, including: (i) a general introduction of the construction and architecture commonly used for stem cell banks; (ii) a detailed section on current quality management practices; (iii) a summary of questions we should consider for long-term storage, such as how long stem cells can be stored stably, how to prevent contamination during long term storage, etc.; (iv) the prospects for stem cell banking.

  8. 75 FR 71317 - Fundamental Principles and Policymaking Criteria for Partnerships With Faith-Based and Other...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... Part IV The President Executive Order 13559--Fundamental Principles and Policymaking Criteria for... Fundamental Principles and Policymaking Criteria for Partnerships With Faith-Based and Other Neighborhood... the following: ``Sec. 2. Fundamental Principles. In formulating and implementing policies that have...

  9. Fundamental Tactical Principles of Soccer: A Comparison of Different Age Groups.

    PubMed

    Borges, Paulo Henrique; Guilherme, José; Rechenchosky, Leandro; da Costa, Luciane Cristina Arantes; Rinaldi, Wilson

    2017-09-01

    The fundamental tactical principles of the game of soccer represent a set of action rules that guide behaviours related to the management of game space. The aim of this study was to compare the performance of fundamental offensive and defensive tactical principles among youth soccer players from 12 to 17 years old. The sample consisted of 3689 tactical actions performed by 48 soccer players in three age categories: under 13 (U-13), under 15 (U-15), and under 17 (U-17). Tactical performance was measured using the System of Tactical Assessment in Soccer (FUT-SAT). The Kruskal Wallis, Mann-Whitney U, Friedman, Wilcoxon, and Cohen's Kappa tests were used in the study analysis. The results showed that the principles of "offensive coverage" (p = 0.01) and "concentration" (p = 0.04) were performed more frequently by the U-17 players than the U-13 players. The tactical principles "width and length" (p < 0.05) and "defensive unit" (p < 0.05) were executed more frequently by younger soccer players. It can be concluded that the frequency with which fundamental tactical principles are performed varies between the gaming categories, which implies that there is valuation of defensive security and a progressive increase in "offensive coverage" caused by increased confidence and security in offensive actions.
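
    For readers unfamiliar with the statistics named above, the sketch below shows how a Kruskal-Wallis test with pairwise Mann-Whitney U follow-ups would be applied to per-player frequencies of one tactical principle across the three age categories; the data are invented for illustration and are not the study's:

      import numpy as np
      from scipy.stats import kruskal, mannwhitneyu

      rng = np.random.default_rng(1)
      # Invented per-player counts of "offensive coverage" actions.
      u13 = rng.poisson(4, 16)
      u15 = rng.poisson(5, 16)
      u17 = rng.poisson(7, 16)

      h, p = kruskal(u13, u15, u17)
      print(f"Kruskal-Wallis: H = {h:.2f}, p = {p:.3f}")
      if p < 0.05:  # follow up with pairwise comparisons, as in the study design
          for name, a, b in [("U13 vs U15", u13, u15),
                             ("U13 vs U17", u13, u17),
                             ("U15 vs U17", u15, u17)]:
              u, p_pair = mannwhitneyu(a, b, alternative="two-sided")
              print(f"{name}: U = {u:.1f}, p = {p_pair:.3f}")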

  10. Fundamental Principles of Coherent-Feedback Quantum Control

    DTIC Science & Technology

    2014-12-08

    ...in metrology (acceleration sensing, vibrometry, gravity wave detection) and in quantum information processing (continuous-variables quantum ...) [report cover: AFRL-OSR-VA-TR-2015-0009, Fundamental Principles of Coherent-Feedback Quantum Control, Hideo Mabuchi, Leland Stanford Junior Univ., CA, Final Report 12/08] ...foundations and potential applications of coherent-feedback quantum control. We have focused on potential applications in quantum-enhanced metrology and

  11. Fundamental Tactical Principles of Soccer: A Comparison of Different Age Groups

    PubMed Central

    Guilherme, José; Rechenchosky, Leandro; da Costa, Luciane Cristina Arantes; Rinaldi, Wilson

    2017-01-01

    The fundamental tactical principles of the game of soccer represent a set of action rules that guide behaviours related to the management of game space. The aim of this study was to compare the performance of fundamental offensive and defensive tactical principles among youth soccer players from 12 to 17 years old. The sample consisted of 3689 tactical actions performed by 48 soccer players in three age categories: under 13 (U-13), under 15 (U-15), and under 17 (U-17). Tactical performance was measured using the System of Tactical Assessment in Soccer (FUT-SAT). The Kruskal Wallis, Mann-Whitney U, Friedman, Wilcoxon, and Cohen’s Kappa tests were used in the study analysis. The results showed that the principles of “offensive coverage” (p = 0.01) and “concentration” (p = 0.04) were performed more frequently by the U-17 players than the U-13 players. The tactical principles “width and length” (p < 0.05) and “defensive unit” (p < 0.05) were executed more frequently by younger soccer players. It can be concluded that the frequency with which fundamental tactical principles are performed varies between the gaming categories, which implies that there is valuation of defensive security and a progressive increase in “offensive coverage” caused by increased confidence and security in offensive actions. PMID:28828091

  12. Minimization principles for the coupled problem of Darcy-Biot-type fluid transport in porous media linked to phase field modeling of fracture

    NASA Astrophysics Data System (ADS)

    Miehe, Christian; Mauthe, Steffen; Teichtmeister, Stephan

    2015-09-01

    This work develops new minimization and saddle point principles for the coupled problem of Darcy-Biot-type fluid transport in porous media at fracture. It shows that the quasi-static problem of elastically deforming, fluid-saturated porous media is related to a minimization principle for the evolution problem. This two-field principle determines the rate of deformation and the fluid mass flux vector. It provides a canonically compact model structure, where the stress equilibrium and the inverse Darcy's law appear as the Euler equations of a variational statement. A Legendre transformation of the dissipation potential relates the minimization principle to a characteristic three field saddle point principle, whose Euler equations determine the evolutions of deformation and fluid content as well as Darcy's law. A further geometric assumption results in modified variational principles for a simplified theory, where the fluid content is linked to the volumetric deformation. The existence of these variational principles underlines inherent symmetries of Darcy-Biot theories of porous media. This can be exploited in the numerical implementation by the construction of time- and space-discrete variational principles, which fully determine the update problems of typical time stepping schemes. Here, the proposed minimization principle for the coupled problem is advantageous with regard to a new unconstrained stable finite element design, while space discretizations of the saddle point principles are constrained by the LBB condition. The variational principles developed provide the most fundamental approach to the discretization of nonlinear fluid-structure interactions, showing symmetric systems in algebraic update procedures. They also provide an excellent starting point for extensions towards more complex problems. This is demonstrated by developing a minimization principle for a phase field description of fracture in fluid-saturated porous media. It is designed for an
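
    Schematically, and only as a hedged sketch rather than the paper's exact notation, such rate-type two-field minimization principles have the structure

      (\dot{u}^{*}, w^{*}) \;=\; \arg\min_{\dot{u},\, w}\; \Pi(\dot{u}, w),
      \qquad
      \Pi(\dot{u}, w) \;=\; \frac{d}{dt}\,E(u, m) \;+\; D(w) \;-\; P_{\mathrm{ext}}(\dot{u}),

    where E is a stored-energy functional, D a dissipation potential for the fluid mass flux w, P_ext the external power, and the fluid content m is tied to w through mass conservation, \dot{m} + \nabla \cdot w = 0. Stationarity with respect to the two rate fields then yields the stress equilibrium and the inverse Darcy law mentioned above.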

  13. Energy Literacy : Essential Principles and Fundamental Concepts for Energy Education

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Energy Literacy: Essential Principles and Fundamental Concepts for Energy Education presents energy concepts that, if understood and applied, will help individuals and communities make informed energy decisions.

  14. Generalized minimal principle for rotor filaments.

    PubMed

    Dierckx, Hans; Wellner, Marcel; Bernus, Olivier; Verschelde, Henri

    2015-05-01

    To a reaction-diffusion medium with an inhomogeneous anisotropic diffusion tensor D, we add a fourth spatial dimension such that the determinant of the diffusion tensor is constant in four dimensions. We propose a generalized minimal principle for rotor filaments, stating that the scroll wave filament strives to minimize its surface area in the higher-dimensional space. As a consequence, stationary scroll wave filaments in the original 3D medium are geodesic curves with respect to the metric tensor G = det(D) D^(-1). The theory is confirmed by numerical simulations for positive and negative filament tension and a model with a non-stationary spiral core. We conclude that filaments in cardiac tissue with positive tension preferentially reside or anchor in regions where cardiac cells are less interconnected, such as portions of the cardiac wall with a large number of cleavage planes.
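
    The geodesic metric quoted above is straightforward to form explicitly; a small numerical sketch (with an invented diffusion tensor) is:

      import numpy as np

      # Build the metric G = det(D) * D^(-1) from a local diffusion tensor D,
      # as quoted in the abstract (the numerical values are invented).
      D = np.array([[1.0, 0.2, 0.0],
                    [0.2, 0.5, 0.1],
                    [0.0, 0.1, 0.8]])   # symmetric positive-definite anisotropy

      G = np.linalg.det(D) * np.linalg.inv(D)

      # Filament length in this metric is the integral of sqrt(dx^T G dx) along the
      # curve; stationary filaments are geodesics of G.
      dx = np.array([0.01, 0.00, 0.02])   # a small step along a candidate filament
      print(G)
      print(float(np.sqrt(dx @ G @ dx)))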

  15. On the Support of Minimizers of Causal Variational Principles

    NASA Astrophysics Data System (ADS)

    Finster, Felix; Schiefeneder, Daniela

    2013-11-01

    A class of causal variational principles on a compact manifold is introduced and analyzed both numerically and analytically. It is proved under general assumptions that the support of a minimizing measure is either completely timelike, or it is singular in the sense that its interior is empty. In the examples of the circle, the sphere and certain flag manifolds, the general results are supplemented by a more detailed and explicit analysis of the minimizers. On the sphere, we get a connection to packing problems and the Tammes distribution. Moreover, the minimal action is estimated from above and below.
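
    In this framework a causal variational principle has, generically, the form of minimizing a double integral of a Lagrangian with respect to a normalized measure (sketched here from the general structure of the framework, not transcribed from the paper):

      \min_{\rho}\; \mathcal{S}[\rho] \;=\; \int_{\mathcal{F}} \int_{\mathcal{F}} \mathcal{L}(x, y)\, d\rho(x)\, d\rho(y),
      \qquad \rho \ge 0,\quad \rho(\mathcal{F}) = 1,

    and the results above concern the support of such a minimizing measure \rho.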

  16. Rigorous force field optimization principles based on statistical distance minimization

    DOE PAGES

    Vlcek, Lukas; Chialvo, Ariel A.

    2015-10-12

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. Here we exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
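
    The statistical distance referred to above is, in its standard form (quoted from general usage rather than from the paper's exact definition), the angle between two discrete distributions p and q,

      s(p, q) \;=\; \arccos\!\Big( \sum_i \sqrt{p_i\, q_i} \Big),

    and minimizing this distinguishability between the model's and the target's sampled distributions is what drives the model's measurable properties toward those of the target.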

  17. Stem cell bioprocessing: fundamentals and principles.

    PubMed

    Placzek, Mark R; Chung, I-Ming; Macedo, Hugo M; Ismail, Siti; Mortera Blanco, Teresa; Lim, Mayasari; Cha, Jae Min; Fauzi, Iliana; Kang, Yunyi; Yeo, David C L; Ma, Chi Yip Joan; Polak, Julia M; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2009-03-06

    In recent years, the potential of stem cell research for tissue engineering-based therapies and regenerative medicine clinical applications has become well established. In 2006, Chung pioneered the first entire organ transplant using adult stem cells and a scaffold for clinical evaluation. With this a new milestone was achieved, with seven patients with myelomeningocele receiving stem cell-derived bladder transplants resulting in substantial improvements in their quality of life. While a bladder is a relatively simple organ, the breakthrough highlights the incredible benefits that can be gained from the cross-disciplinary nature of tissue engineering and regenerative medicine (TERM) that encompasses stem cell research and stem cell bioprocessing. Unquestionably, the development of bioprocess technologies for the transfer of the current laboratory-based practice of stem cell tissue culture to the clinic as therapeutics necessitates the application of engineering principles and practices to achieve control, reproducibility, automation, validation and safety of the process and the product. The successful translation will require contributions from fundamental research (from developmental biology to the 'omics' technologies and advances in immunology) and from existing industrial practice (biologics), especially on automation, quality assurance and regulation. The timely development, integration and execution of various components will be critical-failures of the past (such as in the commercialization of skin equivalents) on marketing, pricing, production and advertising should not be repeated. This review aims to address the principles required for successful stem cell bioprocessing so that they can be applied deftly to clinical applications.

  18. Fundamental Principles in Aesthetic Rhinoplasty

    PubMed Central

    2011-01-01

    This review article will highlight several fundamental principles and advances in rhinoplasty. Nasal analysis has become more sophisticated and thorough in terms of breaking down the anomaly and identifying the anatomic etiology. Performing this analysis in a systematic manner each time helps refine these skills and is a prerequisite to sound surgical planning. Dorsal augmentation with alloplastic materials continues to be used but more conservatively and often mixed with autogenous grafts. Long term outcomes have also taught us much with regard to wound healing and soft tissue contracture. This is best demonstrated with a hump reduction where the progressive pinching at the middle vault creates both aesthetic and functional problems. Correcting the twisted nose is challenging and requires a more aggressive intervention than previously thought. Both cartilage and soft tissue appear to have a degree of memory that predisposes to recurrent deviations. A complete structural breakdown and destabilization may be warranted before the nose is realigned. This must be followed by careful and meticulous restabilization. Tip refinement is a common request but no single maneuver can be universally applied; multiple techniques and grafts must be within the surgeon's armamentarium. PMID:21716951

  19. Applying Minimal Manual Principles for Documentation of Graphical User Interfaces.

    ERIC Educational Resources Information Center

    Nowaczyk, Ronald H.; James, E. Christopher

    1993-01-01

    Investigates the need to include computer screens in documentation for software using a graphical user interface. Describes the uses and purposes of "minimal manuals" and their principles. Studies student reaction to their use of one of three on-screen manuals: screens, icon, and button. Finds some benefit for including icon and button…

  20. Stem cell bioprocessing: fundamentals and principles

    PubMed Central

    Placzek, Mark R.; Chung, I-Ming; Macedo, Hugo M.; Ismail, Siti; Mortera Blanco, Teresa; Lim, Mayasari; Min Cha, Jae; Fauzi, Iliana; Kang, Yunyi; Yeo, David C.L.; Yip Joan Ma, Chi; Polak, Julia M.; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2008-01-01

    In recent years, the potential of stem cell research for tissue engineering-based therapies and regenerative medicine clinical applications has become well established. In 2006, Chung pioneered the first entire organ transplant using adult stem cells and a scaffold for clinical evaluation. With this a new milestone was achieved, with seven patients with myelomeningocele receiving stem cell-derived bladder transplants resulting in substantial improvements in their quality of life. While a bladder is a relatively simple organ, the breakthrough highlights the incredible benefits that can be gained from the cross-disciplinary nature of tissue engineering and regenerative medicine (TERM) that encompasses stem cell research and stem cell bioprocessing. Unquestionably, the development of bioprocess technologies for the transfer of the current laboratory-based practice of stem cell tissue culture to the clinic as therapeutics necessitates the application of engineering principles and practices to achieve control, reproducibility, automation, validation and safety of the process and the product. The successful translation will require contributions from fundamental research (from developmental biology to the ‘omics’ technologies and advances in immunology) and from existing industrial practice (biologics), especially on automation, quality assurance and regulation. The timely development, integration and execution of various components will be critical—failures of the past (such as in the commercialization of skin equivalents) on marketing, pricing, production and advertising should not be repeated. This review aims to address the principles required for successful stem cell bioprocessing so that they can be applied deftly to clinical applications. PMID:19033137

  1. [Fundamental principles of social work--(also) a contribution to public health ethics].

    PubMed

    Lob-Hüdepohl, A

    2009-05-01

    Social work and public health are different but mutually connected. Both are professions with their own ethical foundations. Despite all differences, they have the same goal: to protect and to enhance the well-being of people. This is, in part, why the fundamental ethical principles of social work are salient for developing public health ethics. As a human rights profession, social work respects the personal autonomy of clients, supports solidarity-based relationships in families, groups or communities, and attempts to uphold social justice in society. Social workers need to adopt special professional attitudes: sensibility for the vulnerabilities of clients, care and attentiveness for their resources and strengths, assistance instead of paternalistic care and advocacy in decision making for clients' well-being when clients are not able to decide for themselves. These fundamental ethical principles are the basis for discussion of special topics of social work ethics as public health ethics, for example, in justifying intervention in individual lifestyles by public services without the participation or consent of the affected persons.

  2. Group theoretical derivation of the minimal coupling principle

    NASA Astrophysics Data System (ADS)

    Nisticò, Giuseppe

    2017-04-01

    The group theoretical methods worked out by Bargmann, Mackey and Wigner, which deductively establish the Quantum Theory of a free particle for which Galileian transformations form a symmetry group, are extended to the case of an interacting particle. In doing so, the obstacles caused by loss of symmetry are overcome. In this approach, specific forms of the wave equation of an interacting particle, including the equation derived from the minimal coupling principle, are implied by particular first-order invariance properties that characterize the interaction with respect to specific subgroups of Galileian transformations; moreover, the possibility of yet unknown forms of the wave equation is left open.
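
    For reference, the minimal coupling principle referred to above is the textbook substitution p -> p - qA and E -> E - q\varphi, which for a non-relativistic particle of mass m and charge q yields the wave equation

      i\hbar\, \partial_t \psi \;=\; \left[ \frac{1}{2m}\big( -i\hbar \nabla - q\mathbf{A} \big)^{2} + q\,\varphi \right] \psi ,

    one of the specific forms of the interacting-particle wave equation that the group-theoretical analysis recovers.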

  3. Fundamental principles in periodontal plastic surgery and mucosal augmentation--a narrative review.

    PubMed

    Burkhardt, Rino; Lang, Niklaus P

    2014-04-01

    To provide a narrative review of the current literature elaborating on fundamental principles of periodontal plastic surgical procedures. Based on a presumptive outline of the narrative review, MESH terms have been used to search the relevant literature electronically in the PubMed and Cochrane Collaboration databases. If possible, systematic reviews were included. The review is divided into three phases associated with periodontal plastic surgery: a) pre-operative phase, b) surgical procedures and c) post-surgical care. The surgical procedures were discussed in the light of a) flap design and preparation, b) flap mobilization and c) flap adaptation and stabilization. Pre-operative paradigms include the optimal plaque control and smoking counselling. Fundamental principles in surgical procedures address basic knowledge in anatomy and vascularity, leading to novel appropriate flap designs with papilla preservation. Flap mobilization based on releasing incisions can be performed up to 5 mm. Flap adaptation and stabilization depend on appropriate wound bed characteristics, undisturbed blood clot formation, revascularization and wound stability through adequate suturing. Delicate tissue handling and tension free wound closure represent prerequisites for optimal healing outcomes. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  4. Evidence for the principle of minimal frustration in the evolution of protein folding landscapes.

    PubMed

    Tzul, Franco O; Vasilchuk, Daniel; Makhatadze, George I

    2017-02-28

    Theoretical and experimental studies have firmly established that protein folding can be described by a funneled energy landscape. This funneled energy landscape is the result of foldable protein sequences evolving following the principle of minimal frustration, which allows proteins to rapidly fold to their native biologically functional conformations. For a protein family with a given functional fold, the principle of minimal frustration suggests that, independent of sequence, all proteins within this family should fold with similar rates. However, depending on the optimal living temperature of the organism, proteins also need to modulate their thermodynamic stability. Consequently, the difference in thermodynamic stability should be primarily caused by differences in the unfolding rates. To test this hypothesis experimentally, we performed comprehensive thermodynamic and kinetic analyses of 15 different proteins from the thioredoxin family. Eight of these thioredoxins were extant proteins from psychrophilic, mesophilic, or thermophilic organisms. The other seven protein sequences were obtained using ancestral sequence reconstruction and can be dated back over 4 billion years. We found that all studied proteins fold with very similar rates but unfold with rates that differ up to three orders of magnitude. The unfolding rates correlate well with the thermodynamic stability of the proteins. Moreover, proteins that unfold slower are more resistant to proteolysis. These results provide direct experimental support to the principle of minimal frustration hypothesis.
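
    The link between unfolding rates and stability asserted above follows from standard two-state kinetics (a textbook relation, not a result specific to this paper): with folding rate k_f and unfolding rate k_u,

      \Delta G_{\mathrm{unfold}} \;=\; -RT \ln\!\left(\frac{k_u}{k_f}\right) \;=\; RT \ln\!\left(\frac{k_f}{k_u}\right),

    so if k_f is nearly invariant across the family, differences in thermodynamic stability must appear almost entirely in k_u, which is what the measurements show.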

  5. [The input of medical community into development of fundamental principles of Zemstvo medicine of Russia].

    PubMed

    Yegorysheva, I V

    2013-01-01

    The article considers the participation of the medical community in the formation of the fundamental principles of a unique system of public health--Zemstvo medicine. This involvement found its reflection in the activities of medical scientific societies and congresses and in the periodical medical press.

  6. Fundamental Principles of Tremor Propagation in the Upper Limb.

    PubMed

    Davidson, Andrew D; Charles, Steven K

    2017-04-01

    Although tremor is the most common movement disorder, there exist few effective tremor-suppressing devices, in part because the characteristics of tremor throughout the upper limb are unknown. To clarify, optimally suppressing tremor requires a knowledge of the mechanical origin, propagation, and distribution of tremor throughout the upper limb. Here we present the first systematic investigation of how tremor propagates between the shoulder, elbow, forearm, and wrist. We simulated tremor propagation using a linear, time-invariant, lumped-parameter model relating joint torques and the resulting joint displacements. The model focused on the seven main degrees of freedom from the shoulder to the wrist and included coupled joint inertia, damping, and stiffness. We deliberately implemented a simple model to focus first on the most basic effects. Simulating tremorogenic joint torque as a sinusoidal input, we used the model to establish fundamental principles describing how input parameters (torque location and frequency) and joint impedance (inertia, damping, and stiffness) affect tremor propagation. We expect that the methods and principles presented here will serve as the groundwork for future refining studies to understand the origin, propagation, and distribution of tremor throughout the upper limb in order to enable the future development of optimal tremor-suppressing devices.
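
    The lumped-parameter model described above can be summarized in generic form (the symbols are ours, not the authors') as a coupled second-order system in the seven joint coordinates q:

      \mathbf{I}\,\ddot{\mathbf{q}} \;+\; \mathbf{B}\,\dot{\mathbf{q}} \;+\; \mathbf{K}\,\mathbf{q} \;=\; \boldsymbol{\tau}(t),
      \qquad
      \boldsymbol{\tau}(t) \;=\; \tau_0 \sin(2\pi f t)\,\mathbf{e}_j ,

    where I, B, and K are the coupled inertia, damping, and stiffness matrices, and the tremorogenic torque is a sinusoid of frequency f applied at joint j; propagation is then read off from the steady-state displacement amplitudes at the remaining joints.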

  7. Fundamental Principles of Tremor Propagation in the Upper Limb

    PubMed Central

    Davidson, Andrew D.; Charles, Steven K.

    2017-01-01

    Although tremor is the most common movement disorder, there exist few effective tremor-suppressing devices, in part because the characteristics of tremor throughout the upper limb are unknown. To clarify, optimally suppressing tremor requires a knowledge of the mechanical origin, propagation, and distribution of tremor throughout the upper limb. Here we present the first systematic investigation of how tremor propagates between the shoulder, elbow, forearm, and wrist. We simulated tremor propagation using a linear, time-invariant, lumped-parameter model relating joint torques and the resulting joint displacements. The model focused on the seven main degrees of freedom from the shoulder to the wrist and included coupled joint inertia, damping, and stiffness. We deliberately implemented a simple model to focus first on the most basic effects. Simulating tremorogenic joint torque as a sinusoidal input, we used the model to establish fundamental principles describing how input parameters (torque location and frequency) and joint impedance (inertia, damping, and stiffness) affect tremor propagation. We expect that the methods and principles presented here will serve as the groundwork for future refining studies to understand the origin, propagation, and distribution of tremor throughout the upper limb in order to enable the future development of optimal tremor-suppressing devices. PMID:27957608

  8. The Minimal Control Principle Predicts Strategy Shifts in the Abstract Decision Making Task

    ERIC Educational Resources Information Center

    Taatgen, Niels A.

    2011-01-01

    The minimal control principle (Taatgen, 2007) predicts that people strive for problem-solving strategies that require as few internal control states as possible. In an experiment with the Abstract Decision Making task (ADM task; Joslyn & Hunt, 1998) the reward structure was manipulated to make either a low-control strategy or a high-strategy…

  9. Effective Techniques for Augmenting Heat Transfer: An Application of Entropy Generation Minimization Principles.

    DTIC Science & Technology

    1980-12-01

    augmentation techniques, entropy generation, irreversibility, exergy ... Internally finned tubes ... Internally roughened tubes ... Other heat transfer ... irreversibility and entropy generation as fundamental criterion for evaluating and, eventually, minimizing the waste of usable energy (exergy) in energy

  10. Ultrafast mid-IR laser scalpel: protein signals of the fundamental limits to minimally invasive surgery.

    PubMed

    Amini-Nik, Saeid; Kraemer, Darren; Cowan, Michael L; Gunaratne, Keith; Nadesan, Puviindran; Alman, Benjamin A; Miller, R J Dwayne

    2010-09-28

    Lasers have in principle the capability to cut at the level of a single cell, the fundamental limit to minimally invasive procedures and restructuring biological tissues. To date, this limit has not been achieved due to collateral damage on the macroscale that arises from thermal and shock wave induced collateral damage of surrounding tissue. Here, we report on a novel concept using a specifically designed Picosecond IR Laser (PIRL) that selectively energizes water molecules in the tissue to drive the ablation or cutting process faster than thermal exchange of energy and shock wave propagation, without plasma formation or ionizing radiation effects. The targeted laser process imparts the least amount of energy in the remaining tissue without any of the deleterious photochemical or photothermal effects that accompany other laser wavelengths and pulse parameters. Full thickness incisional and excisional wounds were generated in CD1 mice using the Picosecond IR Laser, a conventional surgical laser (DELight Er:YAG) or mechanical surgical tools. Transmission and scanning electron microscopy showed that the PIRL laser produced minimal tissue ablation with less damage of surrounding tissues than wounds formed using the other modalities. The width of scars formed by wounds made by the PIRL laser was half that of the scars produced using either a conventional surgical laser or a scalpel. Aniline blue staining showed higher levels of collagen in the early stage of the wounds produced using the PIRL laser, suggesting that these wounds mature faster. There were more viable cells extracted from skin using the PIRL laser, suggesting less cellular damage. β-catenin and TGF-β signalling, which are activated during the proliferative phase of wound healing and whose level of activation correlates with the size of wounds, were lower in wounds generated by the PIRL system. Wounds created with the PIRL system also showed a lower rate of cell proliferation. Direct comparison of wound

  11. 3 CFR 13559 - Executive Order 13559 of November 17, 2010. Fundamental Principles and Policymaking Criteria for...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...; (vi) the Secretary of Housing and Urban Development; (vii) the Secretary of Education; (viii) the... social service programs or that support (including through prime awards or sub-awards) social service... following fundamental principles: (a) Federal financial assistance for social service programs should be...

  12. Evolutionary principles and their practical application

    PubMed Central

    Hendry, Andrew P; Kinnison, Michael T; Heino, Mikko; Day, Troy; Smith, Thomas B; Fitt, Gary; Bergstrom, Carl T; Oakeshott, John; Jørgensen, Peter S; Zalucki, Myron P; Gilchrist, George; Southerton, Simon; Sih, Andrew; Strauss, Sharon; Denison, Robert F; Carroll, Scott P

    2011-01-01

    Evolutionary principles are now routinely incorporated into medicine and agriculture. Examples include the design of treatments that slow the evolution of resistance by weeds, pests, and pathogens, and the design of breeding programs that maximize crop yield or quality. Evolutionary principles are also increasingly incorporated into conservation biology, natural resource management, and environmental science. Examples include the protection of small and isolated populations from inbreeding depression, the identification of key traits involved in adaptation to climate change, the design of harvesting regimes that minimize unwanted life-history evolution, and the setting of conservation priorities based on populations, species, or communities that harbor the greatest evolutionary diversity and potential. The adoption of evolutionary principles has proceeded somewhat independently in these different fields, even though the underlying fundamental concepts are the same. We explore these fundamental concepts under four main themes: variation, selection, connectivity, and eco-evolutionary dynamics. Within each theme, we present several key evolutionary principles and illustrate their use in addressing applied problems. We hope that the resulting primer of evolutionary concepts and their practical utility helps to advance a unified multidisciplinary field of applied evolutionary biology. PMID:25567966

  13. Evolutionary principles and their practical application.

    PubMed

    Hendry, Andrew P; Kinnison, Michael T; Heino, Mikko; Day, Troy; Smith, Thomas B; Fitt, Gary; Bergstrom, Carl T; Oakeshott, John; Jørgensen, Peter S; Zalucki, Myron P; Gilchrist, George; Southerton, Simon; Sih, Andrew; Strauss, Sharon; Denison, Robert F; Carroll, Scott P

    2011-03-01

    Evolutionary principles are now routinely incorporated into medicine and agriculture. Examples include the design of treatments that slow the evolution of resistance by weeds, pests, and pathogens, and the design of breeding programs that maximize crop yield or quality. Evolutionary principles are also increasingly incorporated into conservation biology, natural resource management, and environmental science. Examples include the protection of small and isolated populations from inbreeding depression, the identification of key traits involved in adaptation to climate change, the design of harvesting regimes that minimize unwanted life-history evolution, and the setting of conservation priorities based on populations, species, or communities that harbor the greatest evolutionary diversity and potential. The adoption of evolutionary principles has proceeded somewhat independently in these different fields, even though the underlying fundamental concepts are the same. We explore these fundamental concepts under four main themes: variation, selection, connectivity, and eco-evolutionary dynamics. Within each theme, we present several key evolutionary principles and illustrate their use in addressing applied problems. We hope that the resulting primer of evolutionary concepts and their practical utility helps to advance a unified multidisciplinary field of applied evolutionary biology.

  14. Minimal Length Scale Scenarios for Quantum Gravity.

    PubMed

    Hossenfelder, Sabine

    2013-01-01

    We review the question of whether the fundamental laws of nature limit our ability to probe arbitrarily short distances. First, we examine what insights can be gained from thought experiments for probes of shortest distances, and summarize what can be learned from different approaches to a theory of quantum gravity. Then we discuss some models that have been developed to implement a minimal length scale in quantum mechanics and quantum field theory. These models have entered the literature as the generalized uncertainty principle or the modified dispersion relation, and have allowed the study of the effects of a minimal length scale in quantum mechanics, quantum electrodynamics, thermodynamics, black-hole physics and cosmology. Finally, we touch upon the question of ways to circumvent the manifestation of a minimal length scale in short-distance physics.
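
    The generalized uncertainty principle mentioned above is commonly written in the form (one standard variant; beta is a model parameter with dimensions of inverse momentum squared)

      \Delta x\, \Delta p \;\ge\; \frac{\hbar}{2}\left[\, 1 + \beta\, (\Delta p)^{2} \right],
      \qquad
      \Delta x_{\min} \;=\; \hbar\,\sqrt{\beta},

    so that no measurement can localize a particle below the minimal length \Delta x_{\min}.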

  15. Criticism of generally accepted fundamentals and methodologies of traffic and transportation theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerner, Boris S.

    It is explained why the set of the fundamental empirical features of traffic breakdown (a transition from free flow to congested traffic) should be the empirical basis for any traffic and transportation theory that can be reliably used for control and optimization in traffic networks. It is shown that generally accepted fundamentals and methodologies of traffic and transportation theory are not consistent with the set of the fundamental empirical features of traffic breakdown at a highway bottleneck. To these fundamentals and methodologies of traffic and transportation theory belong (i) Lighthill-Whitham-Richards (LWR) theory, (ii) the General Motors (GM) model class (for example, Herman, Gazis et al. GM model, Gipps’s model, Payne’s model, Newell’s optimal velocity (OV) model, Wiedemann’s model, Bando et al. OV model, Treiber’s IDM, Krauß’s model), (iii) the understanding of highway capacity as a particular stochastic value, and (iv) principles for traffic and transportation network optimization and control (for example, Wardrop’s user equilibrium (UE) and system optimum (SO) principles). Alternatively to these generally accepted fundamentals and methodologies of traffic and transportation theory, we discuss three-phase traffic theory as the basis for traffic flow modeling as well as briefly consider the network breakdown minimization (BM) principle for the optimization of traffic and transportation networks with road bottlenecks.
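
    For orientation, the LWR theory criticized above rests on a single conservation law for the vehicle density \rho(x, t) together with a static flow-density relation (the "fundamental diagram"); in standard textbook form,

      \partial_t \rho + \partial_x q(\rho) = 0, \qquad q(\rho) = \rho\, V(\rho).

    Three-phase traffic theory, by contrast, holds that the empirical features of traffic breakdown at a bottleneck cannot be captured by such a single static flow-density relation.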

  16. Safe use of cellular telephones in hospitals: fundamental principles and case studies.

    PubMed

    Cohen, Ted; Ellis, Willard S; Morrissey, Joseph J; Bakuzonis, Craig; David, Yadin; Paperman, W David

    2005-01-01

    Many industries and individuals have embraced cellular telephones. They provide mobile, synchronous communication, which could hypothetically increase the efficiency and safety of inpatient healthcare. However, reports of early analog cellular telephones interfering with critical life-support machines had led many hospitals to strictly prohibit cellular telephones. A literature search revealed that individual hospitals now are allowing cellular telephone use with various policies to prevent electromagnetic interference with medical devices. The fundamental principles underlying electromagnetic interference are immunity, frequency, modulation technology, distance, and power. Electromagnetic interference risk mitigation methods based on these principles have been successfully implemented. In one case study, a minimum distance between cellular telephones and medical devices is maintained, with restrictions in critical areas. In another case study, cellular telephone coverage is augmented to automatically control the power of the cellular telephone. While no uniform safety standard yet exists, cellular telephones can be safely used in hospitals when their use is managed carefully.
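
    A back-of-the-envelope sketch of the distance principle (not taken from the article; a free-space far-field estimate with assumed numbers):

      import math

      def field_strength_v_per_m(eirp_watts: float, distance_m: float) -> float:
          """Free-space far-field estimate E ~ sqrt(30 * EIRP) / d, in V/m."""
          return math.sqrt(30.0 * eirp_watts) / distance_m

      def min_separation_m(eirp_watts: float, immunity_v_per_m: float) -> float:
          """Distance at which the estimated field falls to the device immunity level."""
          return math.sqrt(30.0 * eirp_watts) / immunity_v_per_m

      # Assumed numbers: 2 W handset EIRP, 3 V/m medical-device immunity level.
      print(f"E at 1 m: {field_strength_v_per_m(2.0, 1.0):.1f} V/m")
      print(f"Minimum separation for 3 V/m immunity: {min_separation_m(2.0, 3.0):.1f} m")
      # Better in-building coverage lets the handset transmit at lower power, which
      # shrinks the required separation; that is the point of the second case study.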

  17. Fundamental principles of conducting a surgery economic analysis study.

    PubMed

    Kotsis, Sandra V; Chung, Kevin C

    2010-02-01

    The use of economic evaluation in surgery is scarce. Economic evaluation is used even less so in plastic surgery, in which health-related quality of life is of particular importance. This article, part of a tutorial series on evidence-based medicine, focuses on the fundamental principles of conducting a surgery economic analysis. The authors include the essential aspects of conducting a surgical cost-utility analysis by considering perspectives, costs, outcomes, and utilities. The authors also describe and give examples of how to conduct the analyses (including calculating quality-adjusted life-years and discounting), how to interpret the results, and how to report the results. Although economic analyses are not simple to conduct, a well-conducted one provides many rewards, such as recommending the adoption of a more effective treatment. For comparing and interpreting economic analysis publications, it is important that all studies use consistent methodology and report the results in a similar manner.
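
    A minimal sketch of the QALY and discounting arithmetic described above (hypothetical utilities, costs, and discount rate, not taken from the article):

      def discounted_qalys(utilities_per_year, rate=0.03):
          """Sum utility-weighted life-years, discounting each future year at `rate`."""
          return sum(u / (1.0 + rate) ** t for t, u in enumerate(utilities_per_year))

      def discounted_costs(costs_per_year, rate=0.03):
          return sum(c / (1.0 + rate) ** t for t, c in enumerate(costs_per_year))

      # Hypothetical comparison of a new procedure vs. standard care over 5 years.
      new = {"utilities": [0.85] * 5, "costs": [12000, 500, 500, 500, 500]}
      std = {"utilities": [0.70] * 5, "costs": [4000, 1500, 1500, 1500, 1500]}

      dq = discounted_qalys(new["utilities"]) - discounted_qalys(std["utilities"])
      dc = discounted_costs(new["costs"]) - discounted_costs(std["costs"])
      print(f"Incremental cost-effectiveness ratio: {dc / dq:,.0f} per QALY gained")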

  18. Fundamental Principles of Proper Space Kinematics

    NASA Astrophysics Data System (ADS)

    Wade, Sean

    It is desirable to understand the movement of both matter and energy in the universe based upon fundamental principles of space and time. Time dilation and length contraction are features of Special Relativity derived from the observed constancy of the speed of light. Quantum Mechanics asserts that motion in the universe is probabilistic and not deterministic. While the practicality of these dissimilar theories is well established through widespread application, inconsistencies in their marriage persist, marring their utility, and preventing their full expression. After identifying an error in perspective, the current theories are tested by modifying logical assumptions to eliminate paradoxical contradictions. Analysis of simultaneous frames of reference leads to a new formulation of space and time that predicts the motion of both kinds of particles. Proper Space is a real, three-dimensional space clocked by proper time that is undergoing a densification at the rate of c. Coordinate transformations to a familiar object space and a mathematical stationary space clarify the counterintuitive aspects of Special Relativity. These symmetries demonstrate that within the local universe stationary observers are a forbidden frame of reference; all is in motion. In lieu of Quantum Mechanics and Uncertainty the use of the imaginary number i is restricted for application to the labeling of mass as either material or immaterial. This material phase difference accounts for both the perceived constant velocity of light and its apparent statistical nature. The application of Proper Space Kinematics will advance more accurate representations of microscopic, macroscopic, and cosmological processes and serve as a foundation for further study and reflection, thereafter leading to greater insight.

  19. Research on the fundamental principles of China's marine invasive species prevention legislation.

    PubMed

    Bai, Jiayu

    2014-12-15

    China's coastal area is severely damaged by marine invasive species. Traditional tort theory resolves issues relevant to property damage or personal injuries, through which plaintiffs cannot cope with the ecological damage caused by marine invasive species. Several defects exist within the current legal regimes, such as imperfect management systems, insufficient unified technical standards, and unsound legal responsibility systems. It is necessary to pass legislation to prevent the ecological damage caused by marine invasive species. This investigation probes the fundamental principles needed for the administration and legislation of an improved legal framework to combat the problem of invasive species within China's coastal waters. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. A defense of fundamental principles and human rights: a reply to Robert Baker.

    PubMed

    Macklin, Ruth

    1998-12-01

    This article seeks to rebut Robert Baker's contention that attempts to ground international bioethics in fundamental principles cannot withstand the challenges posed by multiculturalism and postmodernism. First, several corrections are provided of Baker's account of the conclusions reached by the Advisory Committee on Human Radiation Experiments. Second, a rebuttal is offered to Baker's claim that an unbridgeable moral gap exists between Western individualism and non-Western communalism. In conclusion, this article argues that Baker's "nonnegotiable primary goods" cannot do the work of "classical human rights" and that the latter framework is preferable from both a practical and a theoretical standpoint.

  1. Multifluid cosmology: An illustration of fundamental principles

    NASA Astrophysics Data System (ADS)

    Comer, G. L.; Peter, Patrick; Andersson, N.

    2012-05-01

    Our current understanding of the Universe depends on the interplay of several distinct matter components, which interact mainly through gravity, and electromagnetic radiation. The nature of the different components, and possible interactions, tends to be based on the notion of coupled perfect fluids (or scalar fields). This approach is somewhat naive, especially if one wants to be able to consider issues involving heat flow, dissipative mechanisms, or Bose-Einstein condensation of dark matter. We argue that a more natural starting point would be the multipurpose variational relativistic multifluid system that has so far mainly been applied to neutron star astrophysics. As an illustration of the fundamental principles involved, we develop the formalism for determining the nonlinear cosmological solutions to the Einstein equations for a general relativistic two-fluid model for a coupled system of matter (nonzero rest mass) and radiation (zero rest mass). The two fluids are allowed to interpenetrate and exhibit a relative flow with respect to each other, implying, in general, an anisotropic Universe. We use initial conditions such that the massless fluid flux dominates early on so that the situation is effectively that of a single fluid, and one has the usual Friedmann-Lemaître-Robertson-Walker spacetime. We find that there is a Bianchi I transition epoch out of which the matter flux dominates. The situation is then effectively that of a single fluid and the spacetime evolves towards the Friedmann-Lemaître-Robertson-Walker form. Such a transition opens up the possibility of imprinting observable consequences at the specific scale corresponding to the transition time.

  2. Fundamental Assumptions and Aims Underlying the Principles and Policies of Federal Financial Aid to Students. Research Report.

    ERIC Educational Resources Information Center

    Johnstone, D. Bruce

    As background to the National Dialogue on Student Financial Aid, this essay discusses the fundamental assumptions and aims that underlie the principles and policies of federal financial aid to students. These eight assumptions and aims are explored: (1) higher education is the province of states, and not of the federal government; (2) the costs of…

  3. Minimal metabolic pathway structure is consistent with associated biomolecular interactions

    PubMed Central

    Bordbar, Aarash; Nagarajan, Harish; Lewis, Nathan E; Latif, Haythem; Ebrahim, Ali; Federowicz, Stephen; Schellenberger, Jan; Palsson, Bernhard O

    2014-01-01

    Pathways are a universal paradigm for functionally describing cellular processes. Even though advances in high-throughput data generation have transformed biology, the core of our biological understanding, and hence data interpretation, is still predicated on human-defined pathways. Here, we introduce an unbiased, pathway structure for genome-scale metabolic networks defined based on principles of parsimony that do not mimic canonical human-defined textbook pathways. Instead, these minimal pathways better describe multiple independent pathway-associated biomolecular interaction datasets suggesting a functional organization for metabolism based on parsimonious use of cellular components. We use the inherent predictive capability of these pathways to experimentally discover novel transcriptional regulatory interactions in Escherichia coli metabolism for three transcription factors, effectively doubling the known regulatory roles for Nac and MntR. This study suggests an underlying and fundamental principle in the evolutionary selection of pathway structures; namely, that pathways may be minimal, independent, and segregated. PMID:24987116
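
    As a toy illustration of "parsimonious use of cellular components" (not the authors' algorithm), a minimal-total-flux steady-state distribution for a small invented network can be computed with a linear program:

      import numpy as np
      from scipy.optimize import linprog

      # Invented network: r1: -> A, r2: A -> B (direct), r3: A -> C, r4: C -> B,
      # r5: B -> (demand). All reactions irreversible.
      S = np.array([[1, -1, -1,  0,  0],   # A
                    [0,  1,  0,  1, -1],   # B
                    [0,  0,  1, -1,  0]])  # C
      n = S.shape[1]

      bounds = [(0, 100)] * n
      bounds[4] = (10, 10)                 # fix the demand flux r5 = 10

      # Parsimony: minimize total flux subject to the steady-state condition S v = 0.
      res = linprog(c=np.ones(n), A_eq=S, b_eq=np.zeros(3), bounds=bounds, method="highs")
      print(res.x)   # ~ [10, 10, 0, 0, 10]: the shorter of the two routes is selected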

  4. Dynamic sealing principles

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamental principles governing dynamic sealing operation are discussed. Different seals are described in terms of these principles. Despite the large variety of detailed construction, there appear to be some basic principles, or combinations of basic principles, by which all seals function; these are presented and discussed. Theoretical and practical considerations in the application of these principles are discussed. Advantages, disadvantages, limitations, and application examples of various conventional and special seals are presented. Fundamental equations governing liquid and gas flows in thin film seals, which enable leakage calculations to be made, are also presented. The concept of flow functions, the application of the Reynolds lubrication equation and non-lubrication-equation flow, friction and wear, and seal lubrication regimes are explained.
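
    The Reynolds lubrication equation referred to above is, in its common steady, incompressible, isoviscous form (a standard result, not specific to this report),

      \frac{\partial}{\partial x}\!\left( h^{3}\, \frac{\partial p}{\partial x} \right)
      + \frac{\partial}{\partial z}\!\left( h^{3}\, \frac{\partial p}{\partial z} \right)
      \;=\; 6\,\mu\, U\, \frac{\partial h}{\partial x},

    where h(x, z) is the film thickness, p the pressure, \mu the viscosity, and U the relative sliding speed; integrating the resulting pressure and velocity fields across the sealing interface yields the leakage estimates mentioned above.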

  5. Fundamentals of fluid lubrication

    NASA Technical Reports Server (NTRS)

    Hamrock, Bernard J.

    1991-01-01

    The aim is to coordinate the topics of design, engineering dynamics, and fluid dynamics in order to aid researchers in the area of fluid film lubrication. The lubrication principles that are covered can serve as a basis for the engineering design of machine elements. The fundamentals of fluid film lubrication are presented clearly so that students that use the book will have confidence in their ability to apply these principles to a wide range of lubrication situations. Some guidance on applying these fundamentals to the solution of engineering problems is also provided.

  6. Fundamentals and Recent Developments in Approximate Bayesian Computation

    PubMed Central

    Lintusaari, Jarno; Gutmann, Michael U.; Dutta, Ritabrata; Kaski, Samuel; Corander, Jukka

    2017-01-01

    Bayesian inference plays an important role in phylogenetics, evolutionary biology, and in many other branches of science. It provides a principled framework for dealing with uncertainty and quantifying how it changes in the light of new evidence. For many complex models and inference problems, however, only approximate quantitative answers are obtainable. Approximate Bayesian computation (ABC) refers to a family of algorithms for approximate inference that makes a minimal set of assumptions by only requiring that sampling from a model is possible. We explain here the fundamentals of ABC, review the classical algorithms, and highlight recent developments. [ABC; approximate Bayesian computation; Bayesian inference; likelihood-free inference; phylogenetics; simulator-based models; stochastic simulation models; tree-based models.] PMID:28175922
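
    The classical rejection-ABC algorithm can be sketched in a few lines (a toy example inferring the mean of a Gaussian; the prior, summary statistic, and tolerance are assumptions of this sketch):

      import numpy as np

      rng = np.random.default_rng(0)

      # "Observed" data generated at an unknown parameter value (here, true mean 2.0).
      observed = rng.normal(2.0, 1.0, size=50)
      s_obs = observed.mean()                    # summary statistic

      def simulate(theta, size=50):
          # The only requirement of ABC: being able to sample data given parameters.
          return rng.normal(theta, 1.0, size=size)

      accepted = []
      eps = 0.1                                  # tolerance (assumed)
      for _ in range(20_000):
          theta = rng.uniform(-5.0, 5.0)         # draw a candidate from the prior
          if abs(simulate(theta).mean() - s_obs) < eps:
              accepted.append(theta)             # keep parameters whose simulations match

      print(f"posterior mean ~ {np.mean(accepted):.2f} from {len(accepted)} accepted draws")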

  7. Fundamentals of Diesel Engines.

    ERIC Educational Resources Information Center

    Marine Corps Inst., Washington, DC.

    This student guide, one of a series of correspondence training courses designed to improve the job performance of members of the Marine Corps, deals with the fundamentals of diesel engine mechanics. Addressed in the three individual units of the course are the following topics: basic principles of diesel mechanics; principles, mechanics, and…

  8. How You Store Information Affects How You Can Retrieve It: A Fundamental Principle for Business Students Studying Information Systems and Technology

    ERIC Educational Resources Information Center

    Silver, Mark S.

    2017-01-01

    During the current period of rapid technological change, business students need to emerge from their introductory course in Information Systems (IS) with a set of fundamental principles to help them "think about Information Technology (IT)" in future courses and the workplace. Given the digital revolution, they also need to appreciate…

  9. Variation of fundamental constants on sub- and super-Hubble scales: From the equivalence principle to the multiverse

    NASA Astrophysics Data System (ADS)

    Uzan, Jean-Philippe

    2013-02-01

    Fundamental constants play a central role in many modern developments in gravitation and cosmology. Most extensions of general relativity lead to the conclusion that dimensionless constants are actually dynamical fields. Any detection of their variation on sub-Hubble scales would signal a violation of the Einstein equivalence principle and hence point to gravity beyond general relativity. On super-Hubble scales, or perhaps we should say on super-universe scales, such variations are invoked as a solution to the fine-tuning problem, in connection with an anthropic approach.

  10. Free-energy minimization and the dark-room problem.

    PubMed

    Friston, Karl; Thornton, Christopher; Clark, Andy

    2012-01-01

    Recent years have seen the emergence of an important new fundamental theory of brain function. This theory brings information-theoretic, Bayesian, neuroscientific, and machine learning approaches into a single framework whose overarching principle is the minimization of surprise (or, equivalently, the maximization of expectation). The most comprehensive such treatment is the "free-energy minimization" formulation due to Karl Friston (see e.g., Friston and Stephan, 2007; Friston, 2010a,b - see also Fiorillo, 2010; Thornton, 2010). A recurrent puzzle raised by critics of these models is that biological systems do not seem to avoid surprises. We do not simply seek a dark, unchanging chamber, and stay there. This is the "Dark-Room Problem." Here, we describe the problem and further unpack the issues to which it speaks. Using the same format as the prologue of Eddington's Space, Time, and Gravitation (Eddington, 1920), we present our discussion as a conversation between: an information theorist (Thornton), a physicist (Friston), and a philosopher (Clark).

  11. [Participation of forensic medical experts in the examination at the place of occurrence: fundamental legal principles and organizational aspects].

    PubMed

    Kachina, N N

    2013-01-01

    Discussed in this paper are fundamental legal principles and organizational aspects of the participation of forensic medical experts in the examination of corpses at the place of occurrence. A detailed analysis of the current departmental and sectoral regulations governing the activities of specialists in the field of forensic medicine was performed. The analysis demonstrated their positive and negative aspects. These findings were used to develop concrete recommendations for further improvement of these documents.

  12. The principle of minimal episteric distortion of the water matrix and its steering role in protein folding

    NASA Astrophysics Data System (ADS)

    Fernández, Ariel

    2013-08-01

    A significant episteric ("around a solid") distortion of the hydrogen-bond structure of water is promoted by solutes with nanoscale surface detail and physico-chemical complexity, such as soluble natural proteins. These structural distortions defy analysis because the discrete nature of the solvent at the interface is not upheld by the continuous laws of electrostatics. This work derives and validates an electrostatic equation that governs the episteric distortions of the hydrogen-bond matrix. The equation correlates distortions from bulk-like structural patterns with anomalous polarization components that do not align with the electrostatic field of the solute. The result implies that the interfacial energy stored in the orthogonal polarization correlates with the distortion of the water hydrogen-bond network. The result is validated vis-à-vis experimental data on protein interfacial thermodynamics and is interpreted in terms of the interaction energy between the electrostatic field of the solute and the dipole moment induced by the anomalous polarization of interfacial water. Finally, we consider solutes capable of changing their interface through conformational transitions and introduce a principle of minimal episteric distortion (MED) of the water matrix. We assess the importance of the MED principle in the context of protein folding, concluding that the native fold may be identified topologically with the conformation that minimizes the interfacial tension or disruption of the water matrix.

  13. Fundamental principles of data assimilation underlying the Verdandi library: applications to biophysical model personalization within euHeart.

    PubMed

    Chapelle, D; Fragu, M; Mallet, V; Moireau, P

    2013-11-01

    We present the fundamental principles of data assimilation underlying the Verdandi library, and how they are articulated with the modular architecture of the library. This translates, in particular, into the definition of standardized interfaces through which the data assimilation library interoperates with the model simulation software and the so-called observation manager. We also survey various examples of data assimilation applied to the personalization of biophysical models, in particular for cardiac modeling applications within the euHeart European project. This illustrates the power of data assimilation concepts in such novel applications, with tremendous potential in clinical diagnosis assistance.

  14. Fundamentals of Pharmacogenetics in Personalized, Precision Medicine.

    PubMed

    Valdes, Roland; Yin, DeLu Tyler

    2016-09-01

    This article introduces fundamental principles of pharmacogenetics as applied to personalized and precision medicine. Pharmacogenetics establishes relationships between pharmacology and genetics by connecting phenotypes and genotypes in predicting the response of therapeutics in individual patients. We describe differences between precision and personalized medicine and relate principles of pharmacokinetics and pharmacodynamics to applications in laboratory medicine. We also review basic principles of pharmacogenetics, including its evolution, how it enables the practice of personalized therapeutics, and the role of the clinical laboratory. These fundamentals are a segue for understanding specific clinical applications of pharmacogenetics described in subsequent articles in this issue. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Mathematical Optimization Algorithm for Minimizing the Cost Function of GHG Emission in AS/RS Using Positive Selection Based Clonal Selection Principle

    NASA Astrophysics Data System (ADS)

    Mahalakshmi; Murugesan, R.

    2018-04-01

    This paper deals with the minimization of the total cost of greenhouse gas (GHG) emission in an Automated Storage and Retrieval System (AS/RS). A mathematical model is constructed based on the tax cost, penalty cost, and discount cost of GHG emission in the AS/RS. A two-stage algorithm, namely the positive selection based clonal selection principle (PSBCSP), is used to find the optimal solution of the constructed model. In the first stage, the positive selection principle is used to reduce the search space for the optimal solution by fixing a threshold value. In the second stage, the clonal selection principle is used to generate the best solutions. The obtained results are compared with those of other existing algorithms in the literature, showing that the proposed algorithm yields better results.
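
    The two-stage structure described above can be illustrated with a generic sketch: a positive-selection prefilter discards candidates whose cost exceeds a threshold, and a clonal-selection stage clones and mutates the surviving candidates. The cost function and all parameters below are placeholders for illustration, not the paper's AS/RS model.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def ghg_cost(x):
        # Placeholder cost combining tax-, penalty- and discount-like terms (illustrative only).
        return 1.5 * x**2 + 4.0 * max(x - 2.0, 0.0) - 0.5 * min(x, 1.0)

    # Stage 1: positive selection -- keep only candidates whose cost beats a threshold,
    # shrinking the search space before the clonal stage.
    candidates = rng.uniform(-5, 5, size=200)
    threshold = np.median([ghg_cost(x) for x in candidates])
    survivors = [x for x in candidates if ghg_cost(x) < threshold]

    # Stage 2: clonal selection -- clone the best survivors, perturb each clone with
    # noise that grows with rank (better candidates are perturbed less), keep the best.
    for _ in range(50):
        survivors.sort(key=ghg_cost)
        elites = survivors[:10]
        clones = [x + rng.normal(0, 0.1 * (rank + 1))
                  for rank, x in enumerate(elites) for _ in range(5)]
        survivors = sorted(elites + clones, key=ghg_cost)[:len(survivors)]

    print("approximate minimizer:", survivors[0], "cost:", ghg_cost(survivors[0]))
    ```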

  16. The minimal work cost of information processing

    NASA Astrophysics Data System (ADS)

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-07-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamic work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices limits further improvements in their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditioned on the output of the computation. Our formula takes precise account of the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.
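
    In the familiar average-case (i.i.d.) limit, the statement quoted above reduces to the schematic form below; the paper's actual single-shot result replaces the conditional entropy H by a smooth conditional max-entropy, so this is only an orientation, not the precise formula.

    ```latex
    % Schematic average-case form of the minimal work cost of a logical process:
    % discarding the information X while retaining the computation's output Y costs at least
    W_{\min} \;\approx\; k_{B}\,T \ln 2 \; H\!\left(X_{\text{discarded}} \mid Y_{\text{output}}\right)
    ```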

  17. [Fundamental ethical principles in the European framework programmes for research and development].

    PubMed

    Hirsch, François; Karatzas, Isidoros; Zilgalvis, Pēteris

    2009-01-01

    The European Commission is one of the most important international funding bodies for research conducted in Europe and beyond, including in developing countries and countries in transition. Through its framework programmes for research and development, the European Union finances a vast array of projects in fields affecting citizens' health, as well as researchers' mobility, the development of new technologies, and the safeguarding of the environment. With the agreement of the European Parliament and of the Council of Ministers, the two decision-making authorities of the European Union, the 7th Framework Programme started in December 2006. This programme has a budget of 54 billion euros to be distributed over a 7-year period. The European Union thereby aims to meet the challenge set by the European Council of Lisbon (March 2000), which endorsed the goal of devoting 3% of the GDP of all the Member States to research and development. One of the important conditions set by the Members of the European Parliament for allocating this financing is to ensure that "the funding research activities respect the fundamental ethical principles". In this article, we approach this aspect of the evaluation.

  18. Fundamental electrode kinetics

    NASA Technical Reports Server (NTRS)

    Elder, J. P.

    1968-01-01

    This report presents the fundamentals of electrode kinetics and the methods used in evaluating the characteristic parameters of rapid charge-transfer processes at electrode-electrolyte interfaces. The concept of electrode kinetics is outlined, followed by the principles underlying the experimental techniques for the investigation of electrode kinetics.

  19. Minimal measures for Euler-Lagrange flows on finite covering spaces

    NASA Astrophysics Data System (ADS)

    Wang, Fang; Xia, Zhihong

    2016-12-01

    In this paper we study the minimal measures for positive definite Lagrangian systems on compact manifolds. We are particularly interested in manifolds with more complicated fundamental groups. Mather’s theory classifies the minimal or action-minimizing measures according to the first (co-)homology group of a given manifold. We extend Mather’s notion of minimal measures to a larger class for compact manifolds with non-commutative fundamental groups, and use finite coverings to study the structure of these extended minimal measures. We also define action-minimizers and minimal measures in the homotopical sense. Our program is to study the structure of homotopical minimal measures by considering Mather’s minimal measures on finite covering spaces. Our goal is to show that, in general, manifolds with a non-commutative fundamental group have a richer set of minimal measures, hence a richer dynamical structure. As an example, we study the geodesic flow on surfaces of higher genus. Indeed, by going to the finite covering spaces, the set of minimal measures is much larger and more interesting.

  20. Fundamental Ethical Principles in Sports Medicine.

    PubMed

    Devitt, Brian M

    2016-04-01

    In sports medicine, the practice of ethics presents many unique challenges because of the unusual clinical environment of caring for players within the context of a team whose primary goal is to win. Ethical issues frequently arise because a doctor-patient-team triad often replaces the traditional doctor-patient relationship. Conflict may exist when the team's priority clashes with or even replaces the doctor's obligation to player well-being. Customary ethical norms that govern most forms of clinical practice, such as autonomy and confidentiality, are not easily translated to sports medicine. Ethical principles and examples of how they relate to sports medicine are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Designing nanomaterials to maximize performance and minimize undesirable implications guided by the Principles of Green Chemistry.

    PubMed

    Gilbertson, Leanne M; Zimmerman, Julie B; Plata, Desiree L; Hutchison, James E; Anastas, Paul T

    2015-08-21

    The Twelve Principles of Green Chemistry were first published in 1998 and provide a framework that has been adopted not only by chemists, but also by design practitioners and decision-makers (e.g., materials scientists and regulators). The development of the Principles was initially motivated by the need to address decades of unintended environmental pollution and human health impacts from the production and use of hazardous chemicals. Yet, for over a decade now, the Principles have been applied to the synthesis and production of engineered nanomaterials (ENMs) and the products they enable. While the combined efforts of the global scientific community have led to promising advances in the field of nanotechnology, there remain significant research gaps and the opportunity to leverage the potential global economic, societal and environmental benefits of ENMs safely and sustainably. As such, this tutorial review benchmarks the successes to date and identifies critical research gaps to be considered as future opportunities for the community to address. A sustainable material design framework is proposed that emphasizes the importance of establishing structure-property-function (SPF) and structure-property-hazard (SPH) relationships to guide the rational design of ENMs. The goal is to achieve or exceed the functional performance of current materials and the technologies they enable, while minimizing inherent hazard to avoid risk to human health and the environment at all stages of the life cycle.

  2. Fundamental uncertainty limit for speckle displacement measurements.

    PubMed

    Fischer, Andreas

    2017-09-01

    The basic metrological task in speckle photography is to quantify displacements of speckle patterns, allowing, for instance, the investigation of the mechanical load and modification of objects with rough surfaces. However, the fundamental limit of the measurement uncertainty due to photon shot noise is unknown. For this reason, the Cramér-Rao bound (CRB) is derived for speckle displacement measurements, representing the squared minimal achievable measurement uncertainty. As a result, the CRB for speckle patterns is only two times the CRB for an ideal point light source. Hence, speckle photography is an optimal measurement approach for contactless displacement measurements on rough surfaces. In agreement with a derivation from Heisenberg's uncertainty principle, the CRB depends on the number of detected photons and the diffraction limit of the imaging system described by the speckle size. The theoretical results are verified and validated, demonstrating the capability for displacement measurements with nanometer resolution.
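
    The scaling stated in the abstract can be summarized schematically as follows, with N the number of detected photons and d_s the speckle size set by the diffraction limit; only the proportionality and the factor-two relation are asserted here, and prefactors are left open.

    ```latex
    % Minimal displacement uncertainty (square root of the CRB) for speckle photography:
    \sigma_{\Delta x,\min} \;\propto\; \frac{d_{s}}{\sqrt{N}},
    \qquad
    \text{CRB}_{\text{speckle}} \;=\; 2\,\text{CRB}_{\text{point source}}
    ```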

  3. A Minimal Model Describing Hexapedal Interlimb Coordination: The Tegotae-Based Approach

    PubMed Central

    Owaki, Dai; Goda, Masashi; Miyazawa, Sakiko; Ishiguro, Akio

    2017-01-01

    Insects exhibit adaptive and versatile locomotion despite their minimal neural computing. Such locomotor patterns are generated via coordination between leg movements, i.e., an interlimb coordination, which is largely controlled in a distributed manner by neural circuits located in thoracic ganglia. However, the mechanism responsible for the interlimb coordination still remains elusive. Understanding this mechanism will help us to elucidate the fundamental control principle of animals' agile locomotion and to realize robots with legs that are truly adaptive and could not be developed solely by conventional control theories. This study aims at providing a “minimal" model of the interlimb coordination mechanism underlying hexapedal locomotion, in the hope that a single control principle could satisfactorily reproduce various aspects of insect locomotion. To this end, we introduce a novel concept we named “Tegotae,” a Japanese concept describing the extent to which a perceived reaction matches an expectation. By using the Tegotae-based approach, we show that a surprisingly systematic design of local sensory feedback mechanisms essential for the interlimb coordination can be realized. We also use a hexapod robot we developed to show that our mathematical model of the interlimb coordination mechanism satisfactorily reproduces various insects' gait patterns. PMID:28649197

  4. Low dose CT reconstruction via L1 norm dictionary learning using alternating minimization algorithm and balancing principle.

    PubMed

    Wu, Junfeng; Dai, Fang; Hu, Gang; Mou, Xuanqin

    2018-04-18

    Excessive radiation exposure in computed tomography (CT) scans increases the chance of developing cancer and has become a major clinical concern. Recently, statistical iterative reconstruction (SIR) with l0-norm dictionary learning regularization has been developed to reconstruct CT images from low-dose and few-view datasets in order to reduce radiation dose. Nonetheless, the sparse regularization term adopted in this approach is the l0-norm, which cannot guarantee the global convergence of the algorithm. To address this problem, in this study we introduce the l1-norm dictionary learning penalty into the SIR framework for low-dose CT image reconstruction and develop an alternating minimization algorithm to minimize the associated objective function, which transforms the CT image reconstruction problem into a sparse coding subproblem and an image updating subproblem. During the image updating process, an efficient model function approach based on the balancing principle is applied to choose the regularization parameters. The proposed alternating minimization algorithm was evaluated first using real projection data of a sheep lung CT perfusion and then using numerical simulations based on a sheep lung CT image and a chest image. Both visual assessment and quantitative comparison in terms of root mean square error (RMSE) and the structural similarity (SSIM) index demonstrated that the new image reconstruction algorithm yielded performance similar to the l0-norm dictionary learning penalty and outperformed the conventional filtered backprojection (FBP) and total variation (TV) minimization algorithms.
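
    The alternating structure described here (a sparse-coding subproblem over image patches, then an image-updating subproblem against the projection data) is sketched generically below. The system matrix A, dictionary D, and the patch extraction/assembly operators are placeholders, and the update is a plain gradient step; this illustrates the alternating-minimization pattern under those assumptions, not the authors' algorithm.

    ```python
    import numpy as np

    def soft_threshold(z, t):
        """Proximal operator of the l1 norm (elementwise soft thresholding)."""
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def reconstruct(y, A, D, extract_patches, assemble_patches, lam, mu, n_iter=20):
        """Generic l1 dictionary-learning SIR sketch (not the paper's exact algorithm):
        alternate a sparse-coding step over image patches with a gradient-based
        image-update step that fits the projection data y = A x."""
        x = A.T @ y                                        # crude initial image
        step = 1.0 / (np.linalg.norm(A, 2) ** 2 + mu)      # gradient step size
        for _ in range(n_iter):
            # Sparse-coding subproblem: code each patch in the dictionary D.
            patches = extract_patches(x)
            codes = soft_threshold(D.T @ patches, lam)
            x_sparse = assemble_patches(D @ codes)
            # Image-updating subproblem: gradient step on the data-fidelity term,
            # pulled towards the sparse approximation (mu plays the role of the
            # regularization weight chosen, in the paper, by a balancing principle).
            grad = A.T @ (A @ x - y) + mu * (x - x_sparse)
            x = x - step * grad
        return x
    ```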

  5. Homeschooling and Religious Fundamentalism

    ERIC Educational Resources Information Center

    Kunzman, Robert

    2010-01-01

    This article considers the relationship between homeschooling and religious fundamentalism by focusing on their intersection in the philosophies and practices of conservative Christian homeschoolers in the United States. Homeschooling provides an ideal educational setting to support several core fundamentalist principles: resistance to…

  6. Quantum correlations are tightly bound by the exclusivity principle.

    PubMed

    Yan, Bin

    2013-06-28

    It is a fundamental problem in physics to determine what principle limits the correlations predicted by our current description of nature, based on quantum mechanics. One possible explanation is the "global exclusivity" principle recently discussed in Phys. Rev. Lett. 110, 060402 (2013). In this work we show that this principle actually imposes a much stronger restriction on the probability distribution. We provide a tight constraint inequality imposed by this principle and prove that it singles out quantum correlations in scenarios represented by any graph. Our result implies that the exclusivity principle might be one of the fundamental principles of nature.

  7. Dielectric properties of agricultural products – fundamental principles, influencing factors, and measurement techniques. Chapter 4. Electrotechnologies for Food Processing: Book Series. Volume 3. Radio-Frequency Heating

    USDA-ARS?s Scientific Manuscript database

    In this chapter, definitions of dielectric properties, or permittivity, of materials and a brief discussion of the fundamental principles governing their behavior with respect to influencing factors are presented. The basic physics of the influence of frequency of the electric fields and temperatur...

  8. Fundamental Principles of Cancer Biology: Does it have relevance to the perioperative period?

    PubMed

    Jiang, Li; Nick, Alpa M; Sood, Anil K

    2015-09-01

    Malignant tumors are characterized by their ability to metastasize, which is the main cause of cancer-related mortality. Besides intrinsic alterations in cancer cells, the tumor microenvironment plays a pivotal role in tumor growth and metastasis. Ample evidence suggests that the perioperative period and the excision of the primary tumor can promote the development of metastases and can influence long-term cancer patient outcomes. The role of cancer biology and its impact on the perioperative period are of increasing interest. This review will present evidence regarding fundamental principles of cancer biology, especially the tumor microenvironment, and discuss new therapeutic opportunities in the perioperative timeframe. We will also discuss the regulatory signaling that could be relevant to various aspects of surgery and surgical responses, which could facilitate the metastatic process by directly or indirectly affecting malignant tissues and the tumor microenvironment. We address the influences of surgery-related stress, anesthetic and analgesic agents, blood transfusion, hypothermia, and β-adrenergic blockade administration on tumor growth and metastasis. Through an improved understanding of these processes, we will provide suggestions for potential new perioperative approaches aimed at improving treatment outcomes of cancer patients.

  9. STEP and fundamental physics

    NASA Astrophysics Data System (ADS)

    Overduin, James; Everitt, Francis; Worden, Paul; Mester, John

    2012-09-01

    The Satellite Test of the Equivalence Principle (STEP) will advance experimental limits on violations of Einstein's equivalence principle from their present sensitivity of two parts in 10^13 to one part in 10^18 through multiple comparison of the motions of four pairs of test masses of different compositions in a drag-free earth-orbiting satellite. We describe the experiment, its current status, and its potential implications for fundamental physics. Equivalence is at the heart of general relativity, our governing theory of gravity, and violations are expected in most attempts to unify this theory with the other fundamental interactions of physics, as well as in many theoretical explanations for the phenomenon of dark energy in cosmology. Detection of such a violation would be equivalent to the discovery of a new force of nature. A null result would be almost as profound, pushing upper limits on any coupling between standard-model fields and the new light degrees of freedom generically predicted by these theories down to unnaturally small levels.
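
    The sensitivities quoted above are conventionally expressed through the Eötvös ratio comparing the free-fall accelerations a1 and a2 of the two test masses in each pair; the definition below is the standard one, with the numerical limits taken from the abstract.

    ```latex
    % Eötvös ratio for a pair of test masses of different composition:
    \eta \;=\; 2\,\frac{\left|a_{1}-a_{2}\right|}{\left|a_{1}+a_{2}\right|},
    \qquad
    |\eta| \lesssim 10^{-13}\ \text{(current)} \;\longrightarrow\; 10^{-18}\ \text{(STEP goal)}
    ```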

  10. Sensors, Volume 1, Fundamentals and General Aspects

    NASA Astrophysics Data System (ADS)

    Grandke, Thomas; Ko, Wen H.

    1996-12-01

    'Sensors' is the first self-contained series to deal with the whole area of sensors. It describes general aspects, technical and physical fundamentals, construction, function, applications, and developments of the various types of sensors. This volume deals with the fundamentals and common principles of sensors and covers the wide areas of principles, technologies, signal processing, and applications. Contents include: Sensor Fundamentals, e.g. Sensor Parameters, Modeling, Design and Packaging; Basic Sensor Technologies, e.g. Thin and Thick Films, Integrated Magnetic Sensors, Optical Fibres and Integrated Optics, Ceramics and Oxides; Sensor Interfaces, e.g. Signal Processing, Multisensor Signal Processing, Smart Sensors, Interface Systems; Sensor Applications, e.g. Automotive: On-board Sensors, Traffic Surveillance and Control, Home Appliances, Environmental Monitoring, etc. This volume is an indispensable reference work and textbook for both specialists and newcomers, researchers and developers.

  11. Ethical principles in health research and review process.

    PubMed

    Tangwa, Godfrey B

    2009-11-01

    In this paper I want to reflect on the fundamental ethical principles and their application in different particular contexts, especially in health research and the ethics review process. Four fundamental ethical principles have been identified and widely discussed in the bioethics literature. These principles are: autonomy (or respect for others), beneficence, non-maleficence, and justice. They have cross-cultural validity, relevance, and applicability. Every real-life situation and every concrete particular case in which ethical decision-making is called for is unique and different from all others, but the same fundamental ethical principles are relevant and used in addressing all such cases and situations. Very often ethical problems present themselves in the form of dilemmas, and it is then necessary to use the same fundamental principles to analyze the situation and to argue persuasively and cogently for the best options or choices. The issues dealt with in this paper are necessarily abstract and theoretical, but we discuss them from a practical viewpoint, with a view to application in concrete real-life situations. The paper ends with some practical examples of cases that readers can use to test their grasp of the principles, how to apply them, how to balance them in differing situations and contexts, and how to adjudicate between them when they seem to be in conflict.

  12. The 4th Thermodynamic Principle?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montero Garcia, Jose de la Luz; Novoa Blanco, Jesus Francisco

    2007-04-28

    It should be emphasized that the 4th Principle formulated above is a thermodynamic principle and, at the same time, is quantum-mechanical and relativistic, as it inevitably should be; its absence has been one of the main theoretical limitations of physical theory until today. We show that the theoretical discovery of the Dimensional Primitive Octet of Matter, the 4th Thermodynamic Principle, the Quantum Hexet of Matter, the Global Hexagonal Subsystem of Fundamental Constants of Energy, and the Measurement or Connected Global Scale or Universal Existential Interval of the Matter makes it possible to arrive at a global formulation of the four 'forces' or fundamental interactions of nature. Einstein's golden dream is possible.

  13. Achieving sustainable plant disease management through evolutionary principles.

    PubMed

    Zhan, Jiasui; Thrall, Peter H; Burdon, Jeremy J

    2014-09-01

    Plants and their pathogens are engaged in continuous evolutionary battles, and sustainable disease management requires novel systems to create environments conducive to both short-term and long-term disease control. In this opinion article, we argue that knowledge of the fundamental factors that drive host-pathogen coevolution in wild systems can provide new insights into disease development in agriculture. Such evolutionary principles can be used to guide the formulation of sustainable disease management strategies which can minimize disease epidemics while simultaneously reducing pressure on pathogens to evolve increased infectivity and aggressiveness. To ensure agricultural sustainability, disease management programs that reflect the dynamism of pathogen population structure are essential, and evolutionary biologists should play an increasing role in their design. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Fundamental Design Principles for Transcription-Factor-Based Metabolite Biosensors.

    PubMed

    Mannan, Ahmad A; Liu, Di; Zhang, Fuzhong; Oyarzún, Diego A

    2017-10-20

    Metabolite biosensors are central to current efforts toward precision engineering of metabolism. Although most research has focused on building new biosensors, their tunability remains poorly understood and is fundamental for their broad applicability. Here we asked how genetic modifications shape the dose-response curve of biosensors based on metabolite-responsive transcription factors. Using the lac system in Escherichia coli as a model system, we built promoter libraries with variable operator sites that reveal interdependencies between biosensor dynamic range and response threshold. We developed a phenomenological theory to quantify such design constraints in biosensors with various architectures and tunable parameters. Our theory reveals a maximal achievable dynamic range and exposes tunable parameters for orthogonal control of dynamic range and response threshold. Our work sheds light on fundamental limits of synthetic biology designs and provides quantitative guidelines for biosensor design in applications such as dynamic pathway control, strain optimization, and real-time monitoring of metabolism.
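
    A generic Hill-type parameterization is often used to describe such dose-response curves and makes the two tunable quantities discussed above explicit: the dynamic range (ratio of maximal to basal output) and the response threshold (half-maximal inducer level). The sketch below is illustrative and is not the phenomenological model derived in the paper.

    ```python
    import numpy as np

    def dose_response(metabolite, basal, maximal, threshold, hill):
        """Generic Hill-type biosensor dose-response curve (illustrative only).
        'basal' and 'maximal' set the dynamic range (maximal/basal); 'threshold'
        is the metabolite level giving a half-maximal response; 'hill' sets the
        steepness of the transition."""
        u = (metabolite / threshold) ** hill
        return basal + (maximal - basal) * u / (1.0 + u)

    conc = np.logspace(-2, 2, 9)      # metabolite concentration (arbitrary units)
    output = dose_response(conc, basal=5.0, maximal=500.0, threshold=1.0, hill=2.0)
    print("dynamic range:", output.max() / output.min())
    for c, o in zip(conc, output):
        print(f"{c:8.3f} -> {o:8.1f}")
    ```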

  15. The Fundamentals of an African American Value System.

    ERIC Educational Resources Information Center

    Alexander, E. Curtis

    The Nguzo Saba or "Seven Principles of Blackness" provide the fundamental basis for the development of an African American value system that is based on the cultural and historical particularisms of being Black in an American society that devalues Black efficacy and Black people. The fundamentals of this value system, foundational to the Kwanzaa…

  16. Fundamentals of Geophysics

    NASA Astrophysics Data System (ADS)

    Lowrie, William

    1997-10-01

    This unique textbook presents a comprehensive overview of the fundamental principles of geophysics. Unlike most geophysics textbooks, it combines both the applied and theoretical aspects to the subject. The author explains complex geophysical concepts using abundant diagrams, a simplified mathematical treatment, and easy-to-follow equations. After placing the Earth in the context of the solar system, he describes each major branch of geophysics: gravitation, seismology, dating, thermal and electrical properties, geomagnetism, paleomagnetism and geodynamics. Each chapter begins with a summary of the basic physical principles, and a brief account of each topic's historical evolution. The book will satisfy the needs of intermediate-level earth science students from a variety of backgrounds, while at the same time preparing geophysics majors for continued study at a higher level.

  17. Simple approach to sediment provenance tracing using element analysis and fundamental principles

    NASA Astrophysics Data System (ADS)

    Matys Grygar, Tomas; Elznicova, Jitka; Popelka, Jan

    2016-04-01

    Common sediment fingerprinting techniques use either (1) extensive analytical datasets, sometimes nearly complete with respect to accessible characterization techniques, which are processed by multidimensional statistics based on certain assumptions about the distribution functions of analytical results and the conservativeness/additivity of some components, or (2) analytically demanding characteristics, such as isotope ratios, assumed to be unequivocal "labels" of the parent material unaltered by any catchment process. The inherent problem of approach (1) is that the interpretation of statistical components ("sources") is done ex post and remains purely formal. The problem of approach (2) is that catchment processes (weathering, transport, deposition) can modify most geochemical parameters of soils and sediments; in other words, the idea that some geochemical parameters are "conservative" may be idealistic. Grain-size effects and sediment provenance have a joint influence on the chemical composition of fluvial sediments that is not easy to disentangle. Attempts to separate these two main components using only statistics seem risky and equivocal, because the grain-size dependence of element composition is nearly individual for each element and reflects sediment maturity and catchment-specific formation and transport processes. We suppose that the use of less extensive datasets of analytical results, interpreted with respect to fundamental principles, should be more robust than purely statistical tools applied to overwhelming datasets. We examined sediment composition, both published by other researchers and gathered by us, and found some general principles which are, in our opinion, relevant for fingerprinting: (1) concentrations of all elements are grain-size sensitive, i.e. there are no "conservative" elements in the conventional sense of provenance or transport-pathway tracing; (2) fractionation by catchment processes and fluvial transport changes

  18. Emergent features and perceptual objects: re-examining fundamental principles in analogical display design.

    PubMed

    Holt, Jerred; Bennett, Kevin B; Flach, John M

    2015-01-01

    Two sets of design principles for analogical visual displays, based on the concepts of emergent features and perceptual objects, are described. An interpretation of previous empirical findings for three displays (bar graph, polar graphic, alphanumeric) is provided from both perspectives. A fourth display (configural coordinate) was designed using principles of ecological interface design (i.e. direct perception). An experiment was conducted to evaluate performance (accuracy and latency of state identification) with these four displays. Numerous significant effects were obtained and a clear rank ordering of performance emerged (from best to worst): configural coordinate, bar graph, alphanumeric and polar graphic. These findings are consistent with principles of design based on emergent features; they are inconsistent with principles based on perceptual objects. Some limitations of the configural coordinate display are discussed and a redesign is provided. Practitioner Summary: Principles of ecological interface design, which emphasise the quality of very specific mappings between domain, display and observer constraints, are described; these principles are applicable to the design of all analogical graphical displays.

  19. The Inactivation Principle: Mathematical Solutions Minimizing the Absolute Work and Biological Implications for the Planning of Arm Movements

    PubMed Central

    Berret, Bastien; Darlot, Christian; Jean, Frédéric; Pozzo, Thierry; Papaxanthis, Charalambos; Gauthier, Jean Paul

    2008-01-01

    An important question in the literature focusing on motor control is to determine which laws drive biological limb movements. This question has prompted numerous investigations analyzing arm movements in both humans and monkeys. Many theories assume that among all possible movements the one actually performed satisfies an optimality criterion. In the framework of optimal control theory, a first approach is to choose a cost function and test whether the proposed model fits with experimental data. A second approach (generally considered as the more difficult) is to infer the cost function from behavioral data. The cost proposed here includes a term called the absolute work of forces, reflecting the mechanical energy expenditure. Contrary to most investigations studying optimality principles of arm movements, this model has the particularity of using a cost function that is not smooth. First, a mathematical theory related to both direct and inverse optimal control approaches is presented. The first theoretical result is the Inactivation Principle, according to which minimizing a term similar to the absolute work implies simultaneous inactivation of agonistic and antagonistic muscles acting on a single joint, near the time of peak velocity. The second theoretical result is that, conversely, the presence of non-smoothness in the cost function is a necessary condition for the existence of such inactivation. Second, during an experimental study, participants were asked to perform fast vertical arm movements with one, two, and three degrees of freedom. Observed trajectories, velocity profiles, and final postures were accurately simulated by the model. In accordance, electromyographic signals showed brief simultaneous inactivation of opposing muscles during movements. Thus, assuming that human movements are optimal with respect to a certain integral cost, the minimization of an absolute-work-like cost is supported by experimental observations. Such types of optimality
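
    Schematically, the cost discussed above combines an absolute-work term (joint torques τ_i times joint velocities q̇_i) with a smooth regularization term; the exact functional in the paper may differ, but the non-smooth absolute value is the essential ingredient behind the Inactivation Principle.

    ```latex
    % Schematic cost with an absolute-work term and a smooth comfort term g(u):
    C \;=\; \int_{0}^{T} \Big( \sum_{i}\big|\tau_{i}(t)\,\dot{q}_{i}(t)\big| \;+\; \varepsilon\, g\big(u(t)\big)\Big)\,dt
    ```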

  20. Elementary Concepts and Fundamental Laws of the Theory of Heat

    NASA Astrophysics Data System (ADS)

    de Oliveira, Mário J.

    2018-06-01

    The elementary concepts and fundamental laws concerning the science of heat are examined from the point of view of its development with special attention to its theoretical structure. The development is divided into four periods, each one characterized by the concept that was attributed to heat. The transition from one to the next period was marked by the emergence of new concepts and new laws, and by singular events. We point out that thermodynamics, as it emerged, is founded on the elementary concepts of temperature and adiabatic wall, and on the fundamental laws: Mayer-Joule principle, or law of conservation of energy; Carnot principle, which leads to the definition of entropy; and the Clausius principle, or law of increase in entropy.

  2. Making the Most of Minimalism in Music.

    ERIC Educational Resources Information Center

    Geiersbach, Frederick J.

    1998-01-01

    Describes the minimalist movement in music. Discusses generations of minimalist musicians and, in general, the minimalist approach. Considers various ways that minimalist strategies can be integrated into the music classroom focusing on (1) minimalism and (2) student-centered composition and principles of minimalism for use with elementary band…

  3. The Subordination of Aesthetic Fundamentals in College Art Instruction

    ERIC Educational Resources Information Center

    Lavender, Randall

    2003-01-01

    Opportunities for college students of art and design to study fundamentals of visual aesthetics, integrity of form, and principles of composition are limited today by a number of factors. With the well-documented prominence of postmodern critical theory in the world of contemporary art, the study of aesthetic fundamentals is largely subordinated…

  4. Fundamental principles in bacterial physiology—history, recent progress, and the future with focus on cell size control: a review

    NASA Astrophysics Data System (ADS)

    Jun, Suckjoon; Si, Fangwei; Pugatch, Rami; Scott, Matthew

    2018-05-01

    Bacterial physiology is a branch of biology that aims to understand overarching principles of cellular reproduction. Many important issues in bacterial physiology are inherently quantitative, and major contributors to the field have often brought together tools and ways of thinking from multiple disciplines. This article presents a comprehensive overview of major ideas and approaches developed since the early 20th century for anyone who is interested in the fundamental problems in bacterial physiology. This article is divided into two parts. In the first part (sections 1–3), we review the first ‘golden era’ of bacterial physiology from the 1940s to early 1970s and provide a complete list of major references from that period. In the second part (sections 4–7), we explain how the pioneering work from the first golden era has influenced various rediscoveries of general quantitative principles and significant further development in modern bacterial physiology. Specifically, section 4 presents the history and current progress of the ‘adder’ principle of cell size homeostasis. Section 5 discusses the implications of coarse-graining the cellular protein composition, and how the coarse-grained proteome ‘sectors’ re-balance under different growth conditions. Section 6 focuses on physiological invariants, and explains how they are the key to understanding the coordination between growth and the cell cycle underlying cell size control in steady-state growth. Section 7 overviews how the temporal organization of all the internal processes enables balanced growth. In the final section 8, we conclude by discussing the remaining challenges for the future in the field.
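
    The 'adder' principle of cell size homeostasis discussed in section 4 can be illustrated with a minimal simulation: each cell adds an approximately constant volume between birth and division and then divides symmetrically, so deviations in birth size are halved every generation. The noise level and initial size below are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def simulate_adder(n_generations=2000, delta=1.0, noise=0.1, s_birth=1.7):
        """Minimal 'adder' sketch: each cell divides after adding ~delta to its
        birth size, then splits symmetrically.  Deviations from the steady-state
        birth size are halved every generation, giving passive size homeostasis."""
        birth_sizes = [s_birth]
        for _ in range(n_generations):
            added = delta + noise * rng.standard_normal()   # noisy constant increment
            division_size = birth_sizes[-1] + added
            birth_sizes.append(division_size / 2.0)         # symmetric division
        return np.array(birth_sizes)

    sizes = simulate_adder()
    print("mean birth size:", sizes[500:].mean())   # converges to ~delta (= 1.0) here
    print("CV of birth size:", sizes[500:].std() / sizes[500:].mean())
    ```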

  5. Lessons that Bear Repeating and Repeating that Bears Lessons: An Interdisciplinary Unit on Principles of Minimalism in Modern Music, Art, and Poetry (Grades 4-8)

    ERIC Educational Resources Information Center

    Smigel, Eric; McDonald, Nan L.

    2012-01-01

    This theory-to-practice article focuses on interdisciplinary classroom activities based on principles of minimalism in modern music, art, and poetry. A lesson sequence was designed for an inner-city Grades 4 and 5 general classroom of English language learners, where the unit was taught, assessed, and documented by the authors. Included in the…

  6. Complementary Huygens Principle for Geometrical and Nongeometrical Optics

    ERIC Educational Resources Information Center

    Luis, Alfredo

    2007-01-01

    We develop a fundamental principle depicting the generalized ray formulation of optics provided by the Wigner function. This principle is formally identical to the Huygens-Fresnel principle but in terms of opposite concepts, rays instead of waves, and incoherent superpositions instead of coherent ones. This ray picture naturally includes…

  7. Microscopic Description of Le Chatelier's Principle

    ERIC Educational Resources Information Center

    Novak, Igor

    2005-01-01

    A simple approach that "demystifies" Le Chatelier's principle (LCP) and stimulates students to think about the fundamental physical background behind the well-known principle is presented. The approach uses microscopic descriptors of matter, such as energy levels and populations, and does not require any assumption about the fixed amount of substance being…

  8. Biofiltration: Fundamentals, design and operations principles and applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swanson, W.J.; Loehr, R.C.

    1997-06-01

    Biofiltration is a biological air pollution control technology for volatile organic compounds (VOCs). This paper summarizes the fundamentals, design and operation, and application of the process. Biofiltration has been demonstrated to be an effective technology for VOCs from many industries. Large and full-scale systems are in use in Europe and the US. With proper design and operation, VOC removal efficiencies of 95-99% have been achieved. Important parameters for design and performance are empty-bed contact time, gas surface loading, mass loading, elimination capacity, and removal efficiency. Key design and operation factors include chemical and media properties, moisture, pH, temperature, nutrient availability, gas pretreatment, and variations in loading.
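
    The design and performance parameters listed above have conventional definitions, collected in the short sketch below; the numerical values in the example call are illustrative only.

    ```python
    def biofilter_metrics(bed_volume_m3, bed_area_m2, gas_flow_m3_h, c_in_g_m3, c_out_g_m3):
        """Conventional biofiltration design/performance parameters."""
        ebct_s = bed_volume_m3 / gas_flow_m3_h * 3600.0            # empty-bed contact time [s]
        surface_load = gas_flow_m3_h / bed_area_m2                 # gas surface loading [m3/(m2 h)]
        mass_load = gas_flow_m3_h * c_in_g_m3 / bed_volume_m3      # mass loading [g/(m3 h)]
        elim_capacity = gas_flow_m3_h * (c_in_g_m3 - c_out_g_m3) / bed_volume_m3  # elimination capacity [g/(m3 h)]
        removal_eff = 100.0 * (c_in_g_m3 - c_out_g_m3) / c_in_g_m3                # removal efficiency [%]
        return ebct_s, surface_load, mass_load, elim_capacity, removal_eff

    print(biofilter_metrics(bed_volume_m3=50.0, bed_area_m2=25.0,
                            gas_flow_m3_h=3000.0, c_in_g_m3=1.0, c_out_g_m3=0.05))
    ```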

  9. Detection principle of gravitational wave detectors

    NASA Astrophysics Data System (ADS)

    Congedo, Giuseppe

    With the first two detections in late 2015, astrophysics has officially entered into the new era of gravitational wave (GW) observations. Since then, much has been going on in the field with a lot of work focusing on the observations and implications for astrophysics and tests of general relativity in the strong regime. However, much less is understood about how gravitational detectors really work at their fundamental level. For decades, the response to incoming signals has been customarily calculated using the very same physical principle, which has proved so successful in the first detections. In this paper, we review the physical principle that is behind such a detection at the very fundamental level, and we try to highlight the peculiar subtleties that make it so hard in practice. We will then mention how detectors are built starting from this fundamental measurement element.

  10. Minimally invasive surgery. Future developments.

    PubMed

    Wickham, J E

    1994-01-15

    The rapid development of minimally invasive surgery means that there will be fundamental changes in interventional treatment. Technological advances will allow new minimally invasive procedures to be developed. Application of robotics will allow some procedures to be done automatically, and coupling of slave robotic instruments with virtual reality images will allow surgeons to perform operations by remote control. Miniature motors and instruments designed by microengineering could be introduced into body cavities to perform operations that are currently impossible. New materials will allow changes in instrument construction, such as use of memory metals to make heat activated scissors or forceps. With the reduced trauma associated with minimally invasive surgery, fewer operations will require long hospital stays. Traditional surgical wards will become largely redundant, and hospitals will need to cope with increased through-put of patients. Operating theatres will have to be equipped with complex high technology equipment, and hospital staff will need to be trained to manage it. Conventional nursing care will be carried out more in the community. Many traditional specialties will be merged, and surgical training will need fundamental revision to ensure that surgeons are competent to carry out the new procedures.

  11. The Role of Fisher Information Theory in the Development of Fundamental Laws in Physical Chemistry

    ERIC Educational Resources Information Center

    Honig, J. M.

    2009-01-01

    The unifying principle that involves rendering the Fisher information measure an extremum is reviewed. It is shown that with this principle, in conjunction with appropriate constraints, a large number of fundamental laws can be derived from a common source in a unified manner. The resulting economy of thought pertaining to fundamental principles…

  12. "From Fundamental Motives to Rational Expectation Equilibrium (REE, henceforth) of Indeterminacy"

    NASA Astrophysics Data System (ADS)

    Maksoed, Ssi, Wh-

    The "Principle of Indeterminacy" from Heisenberg states that "one of the fundamental cornerstones of quantum mechanics is the Heisenberg uncertainty principle", whereby canonically conjugate quantities can be determined simultaneously only with a characteristic indeterminacy [M. Arevalo Aguilar, et al.]. Following Alfred North Whitehead's conclusion in "The Aims of Education" that mathematical symbols are artificial before new meanings are given to them, two kinds of fundamental motives, (i) expectation-expectation and (ii) expectation-certainty, inherently occur with the determinacy properties of rational expectation equilibrium (REE) [Guido Ascari & Tiziano Ropele, "Trend inflation, Taylor principle & Indeterminacy", Kiel Institute, June 2007]. Further, relative price expressions can be compared through their α and (1 - α) configurations in the expression of possible activity. Acknowledgment to Prof. Dr. Bobby Eka Gunara for "made a rank through physics" denotes...

  13. Black hole complementarity with the generalized uncertainty principle in Gravity's Rainbow

    NASA Astrophysics Data System (ADS)

    Gim, Yongwan; Um, Hwajin; Kim, Wontae

    2018-02-01

    When gravitation is combined with quantum theory, the Heisenberg uncertainty principle could be extended to the generalized uncertainty principle accompanying a minimal length. To see how the generalized uncertainty principle works in the context of black hole complementarity, we calculate the required energy to duplicate information for the Schwarzschild black hole. It shows that the duplication of information is not allowed and black hole complementarity is still valid even assuming the generalized uncertainty principle. On the other hand, the generalized uncertainty principle with the minimal length could lead to a modification of the conventional dispersion relation in light of Gravity's Rainbow, where the minimal length is also invariant as well as the speed of light. Revisiting the gedanken experiment, we show that the no-cloning theorem for black hole complementarity can be made valid in the regime of Gravity's Rainbow on a certain combination of parameters.
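
    A commonly used form of the generalized uncertainty principle with a minimal length is shown below; the specific parameterization adopted in the paper (and its Gravity's Rainbow counterpart) may differ.

    ```latex
    % Generalized uncertainty principle with deformation parameter \beta > 0:
    \Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^{2}\right)
    \quad\Longrightarrow\quad
    \Delta x_{\min} \;=\; \hbar\sqrt{\beta}
    ```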

  14. Minimally invasive surgery. Future developments.

    PubMed Central

    Wickham, J. E.

    1994-01-01

    The rapid development of minimally invasive surgery means that there will be fundamental changes in interventional treatment. Technological advances will allow new minimally invasive procedures to be developed. Application of robotics will allow some procedures to be done automatically, and coupling of slave robotic instruments with virtual reality images will allow surgeons to perform operations by remote control. Miniature motors and instruments designed by microengineering could be introduced into body cavities to perform operations that are currently impossible. New materials will allow changes in instrument construction, such as use of memory metals to make heat activated scissors or forceps. With the reduced trauma associated with minimally invasive surgery, fewer operations will require long hospital stays. Traditional surgical wards will become largely redundant, and hospitals will need to cope with increased through-put of patients. Operating theatres will have to be equipped with complex high technology equipment, and hospital staff will need to be trained to manage it. Conventional nursing care will be carried out more in the community. Many traditional specialties will be merged, and surgical training will need fundamental revision to ensure that surgeons are competent to carry out the new procedures. PMID:8312776

  15. Driving an Active Vibration Balancer to Minimize Vibrations at the Fundamental and Harmonic Frequencies

    NASA Technical Reports Server (NTRS)

    Holliday, Ezekiel S. (Inventor)

    2014-01-01

    Vibrations of a principal machine are reduced at the fundamental and harmonic frequencies by driving the drive motor of an active balancer with balancing signals at the fundamental and selected harmonics. Vibrations are sensed to provide a signal representing the mechanical vibrations. A balancing signal generator for the fundamental and for each selected harmonic processes the sensed vibration signal with adaptive filter algorithms of adaptive filters for each frequency to generate a balancing signal for each frequency. Reference inputs for each frequency are applied to the adaptive filter algorithms of each balancing signal generator at the frequency assigned to the generator. The harmonic balancing signals for all of the frequencies are summed and applied to drive the drive motor. The harmonic balancing signals drive the drive motor with a drive voltage component in opposition to the vibration at each frequency.
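
    A minimal sketch of the scheme described in this abstract is shown below: one LMS-style balancing-signal generator per selected harmonic, each driven by sine/cosine references at its assigned frequency, with the per-frequency outputs summed to drive the balancer. The disturbance, the unity plant, and all parameters are illustrative assumptions.

    ```python
    import numpy as np

    fs = 2000.0                    # sample rate [Hz]
    f0 = 25.0                      # fundamental frequency of the principal machine [Hz]
    harmonics = [1, 2, 3]          # fundamental plus selected harmonics
    mu = 0.01                      # LMS adaptation step size
    weights = {h: np.zeros(2) for h in harmonics}   # (sin, cos) weight pair per frequency

    def machine_vibration(t):
        # Illustrative disturbance: fundamental plus two harmonics.
        return (1.0 * np.sin(2*np.pi*f0*t + 0.3)
                + 0.5 * np.sin(2*np.pi*2*f0*t)
                + 0.2 * np.sin(2*np.pi*3*f0*t + 1.0))

    for n in range(20000):
        t = n / fs
        # Reference inputs at each assigned frequency and the summed balancing signal.
        refs = {h: np.array([np.sin(2*np.pi*h*f0*t), np.cos(2*np.pi*h*f0*t)]) for h in harmonics}
        balance = sum(weights[h] @ refs[h] for h in harmonics)
        # Sensed residual vibration: disturbance minus the balancer's contribution
        # (a unity plant is assumed here purely for illustration).
        residual = machine_vibration(t) - balance
        # LMS update of each per-frequency balancing-signal generator.
        for h in harmonics:
            weights[h] += 2 * mu * residual * refs[h]

    print("final residual sample:", residual)
    ```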

  16. Promoting patient-centred fundamental care in acute healthcare systems.

    PubMed

    Feo, Rebecca; Kitson, Alison

    2016-05-01

    Meeting patients' fundamental care needs is essential for optimal safety and recovery and positive experiences within any healthcare setting. There is growing international evidence, however, that these fundamentals are often poorly executed in acute care settings, resulting in patient safety threats, poorer and costly care outcomes, and dehumanising experiences for patients and families. Whilst care standards and policy initiatives are attempting to address these issues, their impact has been limited. This discussion paper explores, through a series of propositions, why fundamental care can be overlooked in sophisticated, high technology acute care settings. We argue that the central problem lies in the invisibility and subsequent devaluing of fundamental care. Such care is perceived to involve simple tasks that require little skill to execute and have minimal impact on patient outcomes. The propositions explore the potential origins of this prevailing perception, focusing upon the impact of the biomedical model, the consequences of managerial approaches that drive healthcare cultures, and the devaluing of fundamental care by nurses themselves. These multiple sources of invisibility and devaluing surrounding fundamental care have rendered the concept underdeveloped and misunderstood both conceptually and theoretically. Likewise, there remains minimal role clarification around who should be responsible for and deliver such care, and a dearth of empirical evidence and evidence-based metrics. In explicating these propositions, we argue that key to transforming the delivery of acute healthcare is a substantial shift in the conceptualisation of fundamental care. The propositions present a cogent argument that counters the prevailing perception that fundamental care is basic and does not require systematic investigation. We conclude by calling for the explicit valuing and embedding of fundamental care in healthcare education, research, practice and policy. Without this

  17. Controlling molecular transport in minimal emulsions

    NASA Astrophysics Data System (ADS)

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of `minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions.

  18. Periodic minimal surfaces

    NASA Astrophysics Data System (ADS)

    Mackay, Alan L.

    1985-04-01

    A minimal surface is one for which, like a soap film with the same pressure on each side, the mean curvature is zero and, thus, is one where the two principal curvatures are equal and opposite at every point. For every closed circuit in the surface, the area is a minimum. Schwarz [1] and Neovius [2] showed that elements of such surfaces could be put together to give surfaces periodic in three dimensions. These periodic minimal surfaces are geometrical invariants, as are the regular polyhedra, but the former are curved. Minimal surfaces are appropriate for the description of various structures where internal surfaces are prominent and seek to adopt a minimum area or a zero mean curvature subject to their topology; thus they merit more complete numerical characterization. There seem to be at least 18 such surfaces [3], with various symmetries and topologies, related to the crystallographic space groups. Recently, glyceryl mono-oleate (GMO) was shown by Longley and McIntosh [4] to take the shape of the F-surface. The structure postulated is shown here to be in good agreement with an analysis of the fundamental geometry of periodic minimal surfaces.
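
    For reference, the defining condition and a standard triply periodic example can be stated compactly (a sketch in LaTeX notation; the trigonometric level set is the well-known nodal approximation to the Schwarz P surface, given here only as an illustration, not a parametrization of the F-surface discussed above):

        % Mean curvature vanishes everywhere on a minimal surface:
        H = \frac{\kappa_1 + \kappa_2}{2} = 0, \qquad \kappa_1 = -\kappa_2 .
        % Nodal (level-set) approximation to the triply periodic Schwarz P surface:
        \cos x + \cos y + \cos z = 0 .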

  19. Fundamental monogamy relation between contextuality and nonlocality.

    PubMed

    Kurzyński, Paweł; Cabello, Adán; Kaszlikowski, Dagomir

    2014-03-14

    We show that the no-disturbance principle imposes a tradeoff between locally contextual correlations violating the Klyachko-Can-Binicioğlu-Shumovski inequality and spatially separated correlations violating the Clauser-Horne-Shimony-Holt inequality. The violation of one inequality forbids the violation of the other. We also obtain the corresponding monogamy relation imposed by quantum theory for a qutrit-qubit system. Our results show the existence of fundamental monogamy relations between contextuality and nonlocality that suggest that entanglement might be a particular form of a more fundamental resource.
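
    For readers unfamiliar with the two inequalities named above, their standard forms are reproduced below (in LaTeX notation; the specific monogamy tradeoff derived in the paper is not restated here):

        % KCBS noncontextuality inequality for five cyclically compatible observables A_i = ±1:
        \sum_{i=1}^{5} \langle A_i A_{i+1} \rangle \;\ge\; -3, \qquad A_6 \equiv A_1 .
        % CHSH Bell inequality for dichotomic observables A_1, A_2 (Alice) and B_1, B_2 (Bob):
        \left| \langle A_1 B_1\rangle + \langle A_1 B_2\rangle + \langle A_2 B_1\rangle - \langle A_2 B_2\rangle \right| \;\le\; 2 .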

  20. Many-Worlds Interpretation of Quantum Theory and Mesoscopic Anthropic Principle

    NASA Astrophysics Data System (ADS)

    Kamenshchik, A. Yu.; Teryaev, O. V.

    2008-10-01

    We suggest combining the Anthropic Principle with the Many-Worlds Interpretation of quantum theory. By taking the multiplicity of worlds seriously, this combination offers a way to explain certain important events that are assumed to be extremely improbable. The Mesoscopic Anthropic Principle suggested here aims to explain the appearance of such events that are necessary for the emergence of Life and Mind. It is complementary to the Cosmological Anthropic Principle, which explains the fine tuning of fundamental constants. We briefly discuss various possible applications of the Mesoscopic Anthropic Principle, including solar eclipses and the assembly of complex molecules. In addition, we address the problem of Time's Arrow in the framework of the Many-Worlds Interpretation and suggest a recipe for disentangling quantities defined by fundamental physical laws from those determined by anthropic selection.

  1. [The anthropic principle in biology and radiobiology].

    PubMed

    Akif'ev, A P; Degtiarev, S V

    1999-01-01

    In accordance with the anthropic principle of the Universe, the physical constants of the fundamental particles of matter and the laws of their interaction are such that the appearance of man and mind becomes possible and necessary. It is suggested that some biological constants be added to the set of fundamental constants. With the repair of DNA as an example, it was shown how a cell controls certain parameters of the Watson-Crick double helix. It was argued that the concept of the anthropic principle of the Universe in its full form, including biological constants, is a key to developing a unified theory of the evolution of the Universe within the limits of scientific creationism.

  2. Systems Biology Perspectives on Minimal and Simpler Cells

    PubMed Central

    Xavier, Joana C.; Patil, Kiran Raosaheb

    2014-01-01

    SUMMARY The concept of the minimal cell has fascinated scientists for a long time, from both fundamental and applied points of view. This broad concept encompasses extreme reductions of genomes, the last universal common ancestor (LUCA), the creation of semiartificial cells, and the design of protocells and chassis cells. Here we review these different areas of research and identify common and complementary aspects of each one. We focus on systems biology, a discipline that is greatly facilitating the classical top-down and bottom-up approaches toward minimal cells. In addition, we also review the so-called middle-out approach and its contributions to the field with mathematical and computational models. Owing to the advances in genomics technologies, much of the work in this area has been centered on minimal genomes, or rather minimal gene sets, required to sustain life. Nevertheless, a fundamental expansion has been taking place in the last few years wherein the minimal gene set is viewed as a backbone of a more complex system. Complementing genomics, progress is being made in understanding the system-wide properties at the levels of the transcriptome, proteome, and metabolome. Network modeling approaches are enabling the integration of these different omics data sets toward an understanding of the complex molecular pathways connecting genotype to phenotype. We review key concepts central to the mapping and modeling of this complexity, which is at the heart of research on minimal cells. Finally, we discuss the distinction between minimizing the number of cellular components and minimizing cellular complexity, toward an improved understanding and utilization of minimal and simpler cells. PMID:25184563

  3. Systems biology perspectives on minimal and simpler cells.

    PubMed

    Xavier, Joana C; Patil, Kiran Raosaheb; Rocha, Isabel

    2014-09-01

    The concept of the minimal cell has fascinated scientists for a long time, from both fundamental and applied points of view. This broad concept encompasses extreme reductions of genomes, the last universal common ancestor (LUCA), the creation of semiartificial cells, and the design of protocells and chassis cells. Here we review these different areas of research and identify common and complementary aspects of each one. We focus on systems biology, a discipline that is greatly facilitating the classical top-down and bottom-up approaches toward minimal cells. In addition, we also review the so-called middle-out approach and its contributions to the field with mathematical and computational models. Owing to the advances in genomics technologies, much of the work in this area has been centered on minimal genomes, or rather minimal gene sets, required to sustain life. Nevertheless, a fundamental expansion has been taking place in the last few years wherein the minimal gene set is viewed as a backbone of a more complex system. Complementing genomics, progress is being made in understanding the system-wide properties at the levels of the transcriptome, proteome, and metabolome. Network modeling approaches are enabling the integration of these different omics data sets toward an understanding of the complex molecular pathways connecting genotype to phenotype. We review key concepts central to the mapping and modeling of this complexity, which is at the heart of research on minimal cells. Finally, we discuss the distinction between minimizing the number of cellular components and minimizing cellular complexity, toward an improved understanding and utilization of minimal and simpler cells. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  4. The simplicity principle in perception and cognition

    PubMed Central

    Feldman, Jacob

    2016-01-01

    The simplicity principle, traditionally referred to as Occam’s razor, is the idea that simpler explanations of observations should be preferred to more complex ones. In recent decades the principle has been clarified via the incorporation of modern notions of computation and probability, allowing a more precise understanding of how exactly complexity minimization facilitates inference. The simplicity principle has found many applications in modern cognitive science, in contexts as diverse as perception, categorization, reasoning, and neuroscience. In all these areas, the common idea is that the mind seeks the simplest available interpretation of observations— or, more precisely, that it balances a bias towards simplicity with a somewhat opposed constraint to choose models consistent with perceptual or cognitive observations. This brief tutorial surveys some of the uses of the simplicity principle across cognitive science, emphasizing how complexity minimization in a number of forms has been incorporated into probabilistic models of inference. PMID:27470193
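
    As an illustration of complexity minimization in a probabilistic setting, the sketch below scores polynomial models of increasing degree with a crude two-part description length (approximated here by the BIC) and prefers the simplest model that still explains the data. This is a generic example of the idea, not a model taken from the tutorial itself.

        import numpy as np

        # Simplicity principle as approximate description-length minimization (BIC proxy).
        rng = np.random.default_rng(1)
        x = np.linspace(-1, 1, 40)
        y = 1.5 * x - 0.8 * x**2 + 0.1 * rng.standard_normal(x.size)   # data from a degree-2 law

        def description_length(degree):
            coeffs = np.polyfit(x, y, degree)
            resid = y - np.polyval(coeffs, x)
            n, k = x.size, degree + 1
            rss = np.sum(resid**2)
            # n*log(RSS/n): data-fit term; k*log(n): model-complexity penalty
            return n * np.log(rss / n) + k * np.log(n)

        scores = {d: description_length(d) for d in range(6)}
        best = min(scores, key=scores.get)
        print("description-length scores:", {d: round(s, 1) for d, s in scores.items()})
        print("preferred (simplest adequate) degree:", best)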

  5. Emergent gravity of fractons: Mach's principle revisited

    NASA Astrophysics Data System (ADS)

    Pretko, Michael

    2017-07-01

    Recent work has established the existence of stable quantum phases of matter described by symmetric tensor gauge fields, which naturally couple to particles of restricted mobility, such as fractons. We focus on a minimal toy model of a rank 2 tensor gauge field, consisting of fractons coupled to an emergent graviton (massless spin-2 excitation). We show how to reconcile the immobility of fractons with the expected gravitational behavior of the model. First, we reformulate the fracton phenomenon in terms of an emergent center of mass quantum number, and we show how an effective attraction arises from the principles of locality and conservation of center of mass. This interaction between fractons is always attractive and can be recast in geometric language, with a geodesiclike formulation, thereby satisfying the expected properties of a gravitational force. This force will generically be short-ranged, but we discuss how the power-law behavior of Newtonian gravity can arise under certain conditions. We then show that, while an isolated fracton is immobile, fractons are endowed with finite inertia by the presence of a large-scale distribution of other fractons, in a concrete manifestation of Mach's principle. Our formalism provides suggestive hints that matter plays a fundamental role, not only in perturbing, but in creating the background space in which it propagates.

  6. A systems approach to theoretical fluid mechanics: Fundamentals

    NASA Technical Reports Server (NTRS)

    Anyiwo, J. C.

    1978-01-01

    A preliminary application of the underlying principles of the investigator's general system theory to the description and analysis of fluid flow systems is presented. An attempt is made to establish practical models, or elements, of the general fluid flow system from the point of view of the fundamental principles of general system theory. The results obtained are applied to a simple experimental fluid flow system, as a test case, with particular emphasis on the understanding of fluid flow instability, transition, and turbulence.

  7. Principles of a clean operating room environment.

    PubMed

    Howard, James L; Hanssen, Arlen D

    2007-10-01

    Optimizing the operating room environment is necessary to minimize the prevalence of arthroplasty infection. Reduction of bacterial contamination in the operating room should be a primary focus of all members of the operating room team. However, in recent years, there has been a decline in the emphasis of the basic principles of antisepsis in many operating rooms. The purpose of this review is to highlight important considerations for optimizing the operating room environment. These principles should be actively promoted by orthopedic surgeons in their operating rooms as part of a comprehensive approach to minimizing arthroplasty infection.

  8. Contemporary treatment principles for early rheumatoid arthritis: a consensus statement.

    PubMed

    Kiely, Patrick D W; Brown, Andrew K; Edwards, Christopher J; O'Reilly, David T; Ostör, Andrew J K; Quinn, Mark; Taggart, Allister; Taylor, Peter C; Wakefield, Richard J; Conaghan, Philip G

    2009-07-01

    RA has a substantial impact on both patients and healthcare systems. Our objective is to advance the understanding of modern management principles in light of recent evidence concerning the condition's diagnosis and treatment. A group of practicing UK rheumatologists formulated contemporary management principles and clinical practice recommendations concerning both diagnosis and treatment. Areas of clinical uncertainty were documented, leading to research recommendations. A fundamental concept governing treatment of RA is minimization of cumulative inflammation, referred to as the inflammation-time area under the curve (AUC). To achieve this, four core principles of management were identified: (i) detect and refer patients early, even if the diagnosis is uncertain: patients should be referred at the first suspicion of persistent inflammatory polyarthritis and rheumatology departments should provide rapid access to a diagnostic and prognostic service; (ii) treat RA immediately: optimizing outcomes with conventional DMARDs and biologics requires that effective treatment be started early-ideally within 3 months of symptom onset; (iii) tight control of inflammation in RA improves outcome: frequent assessments and an objective protocol should be used to make treatment changes that maintain low-disease activity/remission at an agreed target; (iv) consider the risk-benefit ratio and tailor treatment to each patient: differing patient, disease and drug characteristics require long-term monitoring of risks and benefits with adaptations of treatments to suit individual circumstances. These principles focus on effective control of the inflammatory process in RA, but optimal uptake may require changes in service provision to accommodate appropriate care pathways.
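
    The inflammation-time AUC concept introduced above is straightforward to compute from serial disease-activity measurements. The short sketch below uses hypothetical DAS28 scores and trapezoidal integration purely to illustrate the quantity; it is not a validated clinical metric or part of the consensus statement.

        import numpy as np

        # Hypothetical serial DAS28 disease-activity scores (illustration only)
        weeks = np.array([0, 4, 12, 24, 36, 52], dtype=float)
        das28 = np.array([6.2, 5.1, 3.8, 2.9, 2.6, 2.4])

        # Cumulative inflammation as the area under the activity-time curve (trapezoid rule)
        auc = np.sum(0.5 * (das28[1:] + das28[:-1]) * np.diff(weeks))   # units: score-weeks
        mean_activity = auc / (weeks[-1] - weeks[0])
        print(f"inflammation-time AUC = {auc:.1f} score-weeks "
              f"(time-averaged DAS28 = {mean_activity:.2f})")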

  9. How not to criticize the precautionary principle.

    PubMed

    Hughes, Jonathan

    2006-10-01

    The precautionary principle has its origins in debates about environmental policy, but is increasingly invoked in bioethical contexts. John Harris and Søren Holm argue that the principle should be rejected as incoherent, irrational, and representing a fundamental threat to scientific advance and technological progress. This article argues that while there are problems with standard formulations of the principle, Harris and Holm's rejection of all its forms is mistaken. In particular, they focus on strong versions of the principle and fail to recognize that weaker forms, which may escape their criticisms, are both possible and advocated in the literature.

  10. Role of Fundamental Physics in Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Turyshev, Slava

    2004-01-01

    This talk will discuss the critical role that fundamental physics research plays in human space exploration. In particular, currently available technologies can already provide significant radiation reduction, minimize bone loss, and increase crew productivity, thereby contributing uniquely to overall mission success. I will discuss how fundamental physics research and emerging technologies may not only further reduce the risks of space travel but also increase crew mobility, enhance safety, and increase the value of space exploration in the near future.

  11. Theoretical principles for biology: Variation.

    PubMed

    Montévil, Maël; Mossio, Matteo; Pocheville, Arnaud; Longo, Giuseppe

    2016-10-01

    Darwin introduced the concept that random variation generates new living forms. In this paper, we elaborate on Darwin's notion of random variation to propose that biological variation should be given the status of a fundamental theoretical principle in biology. We state that biological objects such as organisms are specific objects. Specific objects are special in that they are qualitatively different from each other. They can undergo unpredictable qualitative changes, some of which are not defined before they happen. We express the principle of variation in terms of symmetry changes, where symmetries underlie the theoretical determination of the object. We contrast the biological situation with the physical situation, where objects are generic (that is, different objects can be assumed to be identical) and evolve in well-defined state spaces. We derive several implications of the principle of variation, in particular, biological objects show randomness, historicity and contextuality. We elaborate on the articulation between this principle and the two other principles proposed in this special issue: the principle of default state and the principle of organization. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Basic Comfort Heating Principles.

    ERIC Educational Resources Information Center

    Dempster, Chalmer T.

    The material in this beginning book for vocational students presents fundamental principles needed to understand the heating aspect of the sheet metal trade and supplies practical experience to the student so that he may become familiar with the process of determining heat loss for average structures. Six areas covered are: (1) Background…

  13. An experimental approach to the fundamental principles of hemodynamics.

    PubMed

    Pontiga, Francisco; Gaytán, Susana P

    2005-09-01

    An experimental model has been developed to give students hands-on experience with the fundamental laws of hemodynamics. The proposed experimental setup is of simple construction but permits precise measurement of the physical variables involved in the experiments. The model consists of a series of experiments in which different basic phenomena are quantitatively investigated, such as the pressure drop in a long straight vessel and in an obstructed vessel, the transition from laminar to turbulent flow, the association of vessels in vascular networks, and the generation of a critical stenosis. Through these experiments, students acquire a direct appreciation of the importance of the parameters involved in the relationship between pressure and flow rate, thus facilitating the comprehension of more complex problems in hemodynamics.
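
    Two of the relationships probed by such a model, Poiseuille's law for the pressure drop in a long straight vessel and the Reynolds number governing the laminar-turbulent transition, can be evaluated directly. The sketch below uses illustrative parameter values roughly in the physiological range, not measurements from the setup described.

        import numpy as np

        # Poiseuille pressure drop and Reynolds number for a straight rigid tube
        # (assumed, illustrative parameters).
        mu = 3.5e-3       # dynamic viscosity of blood, Pa*s
        rho = 1060.0      # density, kg/m^3
        L = 0.10          # vessel length, m
        r = 2.0e-3        # vessel radius, m
        Q = 5.0e-6        # volumetric flow rate, m^3/s

        delta_p = 8 * mu * L * Q / (np.pi * r**4)      # Poiseuille's law
        v_mean = Q / (np.pi * r**2)                    # mean velocity
        Re = rho * v_mean * (2 * r) / mu               # Reynolds number (D = 2r)

        print(f"pressure drop   = {delta_p:.1f} Pa")
        print(f"mean velocity   = {v_mean:.3f} m/s")
        print(f"Reynolds number = {Re:.0f}  (laminar if well below ~2300)")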

  14. A review of the generalized uncertainty principle.

    PubMed

    Tawfik, Abdel Nasser; Diab, Abdel Magied

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed.
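
    For orientation, the most commonly quoted quadratic form of the GUP and the minimal length it implies are given below (a standard textbook form, not one of the specific higher-order variants reviewed in the paper):

        % Quadratic GUP with deformation parameter \beta:
        \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^{2}\right),
        % which implies a minimal measurable length at \Delta p = 1/\sqrt{\beta}:
        \Delta x_{\min} = \hbar\sqrt{\beta}.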

  15. The Elements and Principles of Design: A Baseline Study

    ERIC Educational Resources Information Center

    Adams, Erin

    2013-01-01

    Critical to the discipline, both professionally and academically, are the fundamentals of interior design. These fundamentals include the elements and principles of interior design: the commonly accepted tools and vocabulary used to create and communicate successful interior environments. Research indicates a lack of consistency in both the…

  16. The simplicity principle in perception and cognition.

    PubMed

    Feldman, Jacob

    2016-09-01

    The simplicity principle, traditionally referred to as Occam's razor, is the idea that simpler explanations of observations should be preferred to more complex ones. In recent decades the principle has been clarified via the incorporation of modern notions of computation and probability, allowing a more precise understanding of how exactly complexity minimization facilitates inference. The simplicity principle has found many applications in modern cognitive science, in contexts as diverse as perception, categorization, reasoning, and neuroscience. In all these areas, the common idea is that the mind seeks the simplest available interpretation of observations- or, more precisely, that it balances a bias toward simplicity with a somewhat opposed constraint to choose models consistent with perceptual or cognitive observations. This brief tutorial surveys some of the uses of the simplicity principle across cognitive science, emphasizing how complexity minimization in a number of forms has been incorporated into probabilistic models of inference. WIREs Cogn Sci 2016, 7:330-340. doi: 10.1002/wcs.1406 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.

  17. Testing Our Fundamental Assumptions

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-06-01

    Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests. Suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics: an intrinsic delay (the photons may simply have been emitted at two different times by the astrophysical source); a delay due to Lorentz invariance violation (perhaps the assumption that all massless particles, even two photons with different energies, move at the exact same velocity in a vacuum is incorrect); a special-relativistic delay (maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong, which would also make photon velocities energy-dependent); or a delay due to the gravitational potential (perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies, which would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect). If we now turn this problem around, then by measuring the arrival time delay between photons of different energies from various astrophysical sources, the further away the source, the better the constraints we can place on these fundamental assumptions.

  18. Beyond the Virtues-Principles Debate.

    ERIC Educational Resources Information Center

    Keat, Marilyn S.

    1992-01-01

    Indicates basic ontological assumptions in the virtues-principles debate in moral philosophy, noting Aristotle's and Kant's fundamental ideas about morality and considering a hermeneutic synthesis of theories. The article discusses what acceptance of the synthesis might mean in the theory and practice of moral pedagogy, offering examples of…

  19. Fundamental principles of an anti-VEGF treatment regimen: optimal application of intravitreal anti-vascular endothelial growth factor therapy of macular diseases.

    PubMed

    Lanzetta, Paolo; Loewenstein, Anat

    2017-07-01

    Intravitreal anti-vascular endothelial growth factor (VEGF) therapy is now considered the gold standard for the treatment of various retinal disorders. As therapy has evolved, so too have the treatment regimens employed by physicians in clinical practice; however, visual outcomes observed in the real world have typically not reflected those reported in clinical trials. Possible reasons for this include a lack of consensus on treatment regimens and a lack of clarity about what the aims of treatment should be. The Vision Academy Steering Committee met to discuss the principles of an ideal treatment regimen, using evidence from the literature to substantiate each point. Literature searches were performed using the MEDLINE/PubMed database (cut-off date: March 2016) and restricted to English-language publications. Studies with fewer than ten patients were excluded from this review. The Steering Committee identified the following four key principles for the ideal treatment regimen for anti-VEGF management of retinal diseases: (1) maximize and maintain visual acuity (VA) benefits for all patients; (2) decide when to treat next, rather than whether to treat now; (3) titrate the treatment intervals to match patients' needs; and (4) treat at each monitoring visit. It is proposed that the adoption of a proactive and more personalized approach in the clinic such as a treat-and-extend regimen will lead to benefits for both the patient and the physician, through a reduction in the associated treatment burden and better utilization of clinic resources. Implementation of the four principles should also lead to better VA outcomes for each patient, with a minimized risk of vision loss.

  20. The fundamental flaw of the HSAB principle is revealed by a complete speciation analysis of the [PtCl(6-n)Br(n)](2-) (n = 0-6) system.

    PubMed

    Gerber, W J; van Wyk, P-H; van Niekerk, D M E; Koch, K R

    2015-02-28

    Bjerrum's model of step-wise ligand exchange is extended to compute a complete speciation diagram for the [PtCl(6-n)Br(n)](2-) (n = 0-6) system including all 17 equilibrium constants concerning the Pt(IV) chlorido-bromido exchange reaction network (HERN). In contrast to what the hard soft acid base (HSAB) principle "predicts", the thermodynamic driving force for the replacement of chloride by bromide in an aqueous matrix, for each individual ligand exchange reaction present in the Pt(IV) HERN, is due to the difference in halide hydration energy and not bonding interactions present in the acid-base complex. A generalized thermodynamic test calculation was developed to illustrate that the HSAB-classified class (b) metal cations Ag(+), Au(+), Au(3+), Rh(3+), Cd(2+), Pt(2+), Pt(4+), Fe(3+), Cd(2+), Sn(2+) and Zn(2+) all form thermodynamically stable halido complexes in the order F(-) ≫ Cl(-) > Br(-) > I(-) irrespective of the sample matrix. The bonding interactions in the acid-base complex, e.g. ionic-covalent σ-bonding, π-bonding and electron correlation effects, play no actual role in the classification of these metal cations using the HSAB principle. Instead, it turns out that the hydration/solvation energy of halides is the reason why metal cations are categorized into two classes using the HSAB principle, which highlights the fundamental flaw of the HSAB principle.
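
    The step-wise (Bjerrum-type) speciation calculation underlying such an analysis can be sketched in a few lines: given cumulative formation constants, the mole fraction of each mixed-halide species follows from the free bromide concentration. The constants below are placeholders chosen only to illustrate the arithmetic; they are not the 17 equilibrium constants reported in the paper.

        import numpy as np

        # Bjerrum-type speciation of [PtCl(6-n)Br(n)](2-) versus free [Br-], using
        # HYPOTHETICAL stepwise constants K1..K6 (illustrates the method only).
        K_step = np.array([1.0e3, 5.0e2, 2.0e2, 8.0e1, 3.0e1, 1.0e1])
        beta = np.concatenate(([1.0], np.cumprod(K_step)))   # cumulative constants, beta_0 = 1

        def mole_fractions(br_conc):
            """Fraction of each species n = 0..6 at a given free bromide concentration."""
            terms = beta * br_conc ** np.arange(7)
            return terms / terms.sum()

        for br in (1e-5, 1e-3, 1e-1):
            x = mole_fractions(br)
            dominant = int(np.argmax(x))
            print(f"[Br-] = {br:.0e} M -> dominant species n = {dominant}, "
                  f"fractions = {np.round(x, 3)}")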

  1. Itch Management: General Principles.

    PubMed

    Misery, Laurent

    2016-01-01

    Like pain, itch is a challenging condition that needs to be managed. Within this setting, the first principle of itch management is to get an appropriate diagnosis to perform an etiology-oriented therapy. In several cases it is not possible to treat the cause, the etiology is undetermined, there are several causes, or the etiological treatment is not effective enough to alleviate itch completely. This is also why there is need for symptomatic treatment. In all patients, psychological support and associated pragmatic measures might be helpful. General principles and guidelines are required, yet patient-centered individual care remains fundamental. © 2016 S. Karger AG, Basel.

  2. Anesthesia for minimally invasive neurosurgery.

    PubMed

    Prabhakar, Hemanshu; Mahajan, Charu; Kapoor, Indu

    2017-10-01

    With the ultimate aim of improving patients' overall outcomes and satisfaction, the minimally invasive surgical approach is becoming the norm. The related anesthetic evidence has not expanded at the same rate as surgical and technological advances. This article reviews recent evidence on anesthesia and perioperative concerns for patients undergoing minimally invasive neurosurgery. Minimally invasive cranial and spinal surgeries have been made possible only by extensive technological development: points of surgical interest can be precisely located with the help of stereotaxy, neuronavigation, and special endoscopes that decrease tissue trauma. The principles of neuroanesthesia remain the same, but a few concerns are specific to each technique. Dexmedetomidine has a favorable profile for procedures carried out under sedation. As new surgical techniques emerge, less familiar anesthetic concerns may also come to light. Over the last year, little new information has been added to the existing literature regarding anesthesia for minimally invasive neurosurgery. Neuroanesthesia goals remain the same, and less invasive surgical techniques do not in themselves translate into safer anesthesia. Specific concerns for each procedure should be taken into consideration.

  3. Evolved Minimal Frustration in Multifunctional Biomolecules.

    PubMed

    Röder, Konstantin; Wales, David J

    2018-05-25

    Protein folding is often viewed in terms of a funnelled potential or free energy landscape. A variety of experiments now indicate the existence of multifunnel landscapes, associated with multifunctional biomolecules. Here, we present evidence that these systems have evolved to exhibit the minimal number of funnels required to fulfil their cellular functions, suggesting an extension to the principle of minimum frustration. We find that minimal disruptive mutations result in additional funnels, and the associated structural ensembles become more diverse. The same trends are observed in an atomic cluster. These observations suggest guidelines for rational design of engineered multifunctional biomolecules.

  4. Fundamental Principles of Classical Mechanics: A Geometrical Perspective

    NASA Astrophysics Data System (ADS)

    Lam, Kai S.

    2014-07-01

    Classical mechanics is the quantitative study of the laws of motion for macroscopic physical systems with mass. The fundamental laws of this subject, known as Newton's Laws of Motion, are expressed in terms of second-order differential equations governing the time evolution of vectors in a so-called configuration space of a system (see Chapter 12). In an elementary setting, these are usually vectors in 3-dimensional Euclidean space, such as position vectors of point particles; but typically they can be vectors in higher dimensional and more abstract spaces. A general knowledge of the mathematical properties of vectors, not only in their most intuitive incarnations as directed arrows in physical space but as elements of abstract linear vector spaces, and those of linear operators (transformations) on vector spaces as well, is then indispensable in laying the groundwork for both the physical and the more advanced mathematical - more precisely topological and geometrical - concepts that will prove to be vital in our subject. In this beginning chapter we will review these properties, and introduce the all-important related notions of dual spaces and tensor products of vector spaces. The notational convention for vectorial and tensorial indices used for the rest of this book (except when otherwise specified) will also be established...

  5. How to start a minimal access mitral valve program.

    PubMed

    Hunter, Steven

    2013-11-01

    The seven pillars of governance established by the National Health Service in the United Kingdom provide a useful framework for the process of introducing new procedures to a hospital. Drawing from local experience, the author presents guidance for institutions considering establishing a minimal access mitral valve program. The seven pillars of governance apply to the practice of minimally invasive mitral valve surgery, based on the principle of patient-centred practice. The author delineates the benefits of minimally invasive mitral valve surgery in terms of: "clinical effectiveness", including reduced length of hospital stay, "risk management effectiveness", including conversion to sternotomy and aortic dissection, "patient experience", including improved cosmesis and quicker recovery, and the effectiveness of communication, resources and strategies in the implementation of minimally invasive mitral valve surgery. Finally, the author has identified seven learning curves experienced by surgeons involved in introducing a minimal access mitral valve program. The learning curves are defined as: techniques of mitral valve repair, Transoesophageal Echocardiography-guided cannulation, incisions, instruments, visualization, aortic occlusion and cardiopulmonary bypass strategies. From local experience, the author provides advice on how to reduce the learning curves, such as practising with the specialised instruments and visualization techniques during sternotomy cases. Underpinning the NHS pillars are the principles of systems awareness, teamwork, communication, ownership and leadership, all of which are paramount to performing any surgery but more so with minimal access surgery, as will be highlighted throughout this paper.

  6. How to start a minimal access mitral valve program

    PubMed Central

    2013-01-01

    The seven pillars of governance established by the National Health Service in the United Kingdom provide a useful framework for the process of introducing new procedures to a hospital. Drawing from local experience, the author presents guidance for institutions considering establishing a minimal access mitral valve program. The seven pillars of governance apply to the practice of minimally invasive mitral valve surgery, based on the principle of patient-centred practice. The author delineates the benefits of minimally invasive mitral valve surgery in terms of: “clinical effectiveness”, including reduced length of hospital stay, “risk management effectiveness”, including conversion to sternotomy and aortic dissection, “patient experience”, including improved cosmesis and quicker recovery, and the effectiveness of communication, resources and strategies in the implementation of minimally invasive mitral valve surgery. Finally, the author has identified seven learning curves experienced by surgeons involved in introducing a minimal access mitral valve program. The learning curves are defined as: techniques of mitral valve repair, Transoesophageal Echocardiography-guided cannulation, incisions, instruments, visualization, aortic occlusion and cardiopulmonary bypass strategies. From local experience, the author provides advice on how to reduce the learning curves, such as practising with the specialised instruments and visualization techniques during sternotomy cases. Underpinning the NHS pillars are the principles of systems awareness, teamwork, communication, ownership and leadership, all of which are paramount to performing any surgery but more so with minimal access surgery, as will be highlighted throughout this paper. PMID:24349981

  7. Massively parallel GPU-accelerated minimization of classical density functional theory

    NASA Astrophysics Data System (ADS)

    Stopper, Daniel; Roth, Roland

    2017-08-01

    In this paper, we discuss the ability to numerically minimize the grand potential of hard disks in two-dimensional and of hard spheres in three-dimensional space within the framework of classical density functional and fundamental measure theory on modern graphics cards. Our main finding is that a massively parallel minimization leads to an enormous performance gain in comparison to standard sequential minimization schemes. Furthermore, the results indicate that in complex multi-dimensional situations, a heavy parallel minimization of the grand potential seems to be mandatory in order to reach a reasonable balance between accuracy and computational cost.
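
    The numerical minimization itself reduces to an iterative (Picard-type) solution of the Euler-Lagrange equation of the grand potential. The sketch below shows the damped fixed-point scheme for the drastically simplified case of an ideal gas in an external potential on a 1D grid, with no fundamental-measure excess functional and no GPU parallelism; it only illustrates the structure of the sequential minimization that the paper accelerates.

        import numpy as np

        # Damped Picard iteration for the grand-potential minimum of an ideal gas in an
        # external field (toy 1D example; real FMT calculations add an excess functional).
        beta = 1.0                        # 1/(kB*T) in reduced units
        mu = 0.0                          # chemical potential
        x = np.linspace(0.0, 10.0, 1000)
        V_ext = 0.5 * (x - 5.0) ** 2      # harmonic external potential

        rho = np.full_like(x, 0.5)        # initial guess for the density profile
        alpha = 0.1                       # mixing (damping) parameter

        for it in range(500):
            # Euler-Lagrange condition of Omega[rho] for the ideal gas:
            # ln(rho) = beta*(mu - V_ext)  ->  rho_new = exp(beta*(mu - V_ext))
            rho_new = np.exp(beta * (mu - V_ext))
            if np.max(np.abs(rho_new - rho)) < 1e-10:
                break
            rho = (1.0 - alpha) * rho + alpha * rho_new

        print("iterations:", it, " max density:", rho.max())   # ~exp(beta*mu) at the minimum

    In a full calculation, the weighted densities of fundamental measure theory enter the exponent as functional derivatives of the excess free energy, and it is exactly those convolution-heavy updates that benefit from the massively parallel evaluation described in the paper.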

  8. Integration of Social Studies Principles in the Home Economics Curriculum.

    ERIC Educational Resources Information Center

    Texas Tech Univ., Lubbock. Home Economics Curriculum Center.

    This document is intended to help secondary home economics teachers incorporate social studies principles into their curriculum. After an introduction, the document is divided into three sections. The first section identifies and explains fundamental principles within social studies and covers the history and current state of the social studies…

  9. Effects of Phonetic Context on Relative Fundamental Frequency

    ERIC Educational Resources Information Center

    Lien, Yu-An S.; Gattuccio, Caitlin I.; Stepp, Cara E.

    2014-01-01

    Purpose: The effect of phonetic context on relative fundamental frequency (RFF) was examined, in order to develop stimuli sets with minimal within-speaker variability that can be implemented in future clinical protocols. Method: Sixteen speakers with healthy voices produced RFF stimuli. Uniform utterances consisted of 3 repetitions of the same…

  10. The principles of teratology: are they still true?

    PubMed

    Friedman, Jan M

    2010-10-01

    James Wilson originally proposed a set of "Principles of Teratology" in 1959, the year before he helped to found the Teratology Society. By 1977, when these Principles were presented in a more definitive form in Wilson and Fraser's Handbook of Teratology, they had become a standard formulation of the basic tenets of the field. Wilson's Principles have continued to guide scientific research in teratology, and they are widely used in teaching. Recent advances in our knowledge of the molecular and cellular bases of embryogenesis serve only to provide a deeper understanding of the fundamental developmental mechanisms that underlie Wilson's Principles of Teratology. © 2010 Wiley-Liss, Inc.

  11. The maximum work principle regarded as a consequence of an optimization problem based on mechanical virtual power principle and application of constructal theory

    NASA Astrophysics Data System (ADS)

    Gavrus, Adinel

    2017-10-01

    This paper proposes to show that the maximum work principle used in the theory of continuum plasticity can be regarded as a consequence of an optimization problem based on constructal theory (A. Bejan). Thermodynamics defines the conservation of energy and the irreversibility of the evolution of natural systems. From a mechanical point of view, the first law permits the definition of the momentum balance equation and the virtual power principle, while the second explains the tendency of all currents to flow from high to low values. According to the constructal law, every finite-size system evolves toward configurations that flow more and more easily over time, distributing imperfections so as to maximize entropy and minimize losses or dissipation. Applying constructal principles to a material forming process leads to the conclusion that, under external loads, the material flow is the one for which the total dissipated mechanical power (deformation and friction) becomes minimal. From a mechanical point of view, the real state of the mechanical variables (stress, strain, strain rate) can then be characterized as the one that minimizes the total dissipated power: among all virtual non-equilibrium states, the real state minimizes the total dissipated power. A variational minimization problem is thus obtained, and this paper proves in a mathematical sense that, starting from this formulation, the maximum work principle can be recovered in a more general form, together with an equivalent form for the friction term. An application to the plane compression of a plastic material shows the feasibility of the proposed minimization formulation for finding analytical solutions in two cases: one without friction and a second that takes into account a Tresca friction law. To validate the proposed formulation, a comparison is made with a classical analytical analysis based on the slice (slab) and upper-bound methods.
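
    One classical way to make the minimization statement above concrete is the rigid-plastic upper-bound (Markov-type) functional: among kinematically admissible velocity fields, the actual flow minimizes the total dissipated power from plastic deformation and interface friction. The form below is a standard textbook expression written for orientation, not the specific functional derived in the paper; here σ̄ is the flow stress, ε̇̄ the equivalent strain rate, k the shear yield stress, m the friction factor, and Δv the velocity discontinuity on the tool interface S_f.

        % Total power functional minimized by the real flow (rigid-plastic material,
        % Tresca friction with factor m on the tool interface S_f):
        J(\mathbf{v}) \;=\; \int_{V} \bar{\sigma}\,\dot{\bar{\varepsilon}}(\mathbf{v})\,dV
                      \;+\; \int_{S_f} m\,k\,\lVert \Delta\mathbf{v} \rVert\,dS,
        \qquad
        J(\mathbf{v}^{\mathrm{real}}) \;=\; \min_{\mathbf{v}\ \mathrm{admissible}} J(\mathbf{v}).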

  12. Dynamic principle for ensemble control tools.

    PubMed

    Samoletov, A; Vasiev, B

    2017-11-28

    Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
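
    As a concrete instance of a stochastic thermostat that leaves the canonical measure invariant, the sketch below integrates underdamped Langevin dynamics for a harmonic oscillator with a BAOAB-type splitting and checks the sampled kinetic temperature. It is a generic illustration of a thermostat scheme, not the specific principle or scheme derived in the paper.

        import numpy as np

        # BAOAB-splitting Langevin thermostat for a 1D harmonic oscillator
        # (generic stochastic thermostat sampling the canonical ensemble).
        rng = np.random.default_rng(2)
        kT, m, k_spring, gamma = 1.0, 1.0, 1.0, 1.0
        dt, n_steps = 0.05, 200_000

        c1 = np.exp(-gamma * dt)
        c2 = np.sqrt((1.0 - c1**2) * kT / m)

        x, v = 1.0, 0.0
        v2_sum = 0.0
        for step in range(n_steps):
            v += 0.5 * dt * (-k_spring * x) / m          # B: half kick
            x += 0.5 * dt * v                            # A: half drift
            v = c1 * v + c2 * rng.standard_normal()      # O: Ornstein-Uhlenbeck thermostat step
            x += 0.5 * dt * v                            # A: half drift
            v += 0.5 * dt * (-k_spring * x) / m          # B: half kick
            v2_sum += v * v

        print("sampled kinetic temperature m<v^2> ~", m * v2_sum / n_steps)  # should be ~ kT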

  13. Minimal perceptrons for memorizing complex patterns

    NASA Astrophysics Data System (ADS)

    Pastor, Marissa; Song, Juyong; Hoang, Danh-Tai; Jo, Junghyo

    2016-11-01

    Feedforward neural networks have been investigated to understand learning and memory, as well as applied to numerous practical problems in pattern classification. It is a rule of thumb that more complex tasks require larger networks. However, the design of optimal network architectures for specific tasks is still an unsolved fundamental problem. In this study, we consider three-layered neural networks for memorizing binary patterns. We developed a new complexity measure of binary patterns, and estimated the minimal network size for memorizing them as a function of their complexity. We formulated the minimal network size for regular, random, and complex patterns. In particular, the minimal size for complex patterns, which are neither ordered nor disordered, was predicted by measuring their Hamming distances from known ordered patterns. Our predictions agree with simulations based on the back-propagation algorithm.
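
    A minimal three-layered network of the kind studied above can be trained by back-propagation to memorize a set of random binary patterns; the sketch below (generic NumPy code, not the authors' implementation) shows how memorization can be probed for a chosen number of hidden units and patterns.

        import numpy as np

        # Back-propagation memorization of random binary patterns with a 3-layer perceptron.
        rng = np.random.default_rng(3)
        n_in, n_hidden, n_patterns = 16, 8, 20

        X = rng.integers(0, 2, size=(n_patterns, n_in)).astype(float)   # binary inputs
        y = rng.integers(0, 2, size=(n_patterns, 1)).astype(float)      # binary labels

        W1 = rng.normal(0, 0.5, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
        W2 = rng.normal(0, 0.5, (n_hidden, 1));    b2 = np.zeros(1)
        sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

        lr = 0.5
        for epoch in range(5000):
            h = sigmoid(X @ W1 + b1)            # hidden layer
            out = sigmoid(h @ W2 + b2)          # output layer
            err = out - y                       # gradient of cross-entropy w.r.t. output pre-activation
            dW2 = h.T @ err / n_patterns
            dh = (err @ W2.T) * h * (1 - h)     # back-propagated hidden-layer gradient
            dW1 = X.T @ dh / n_patterns
            W2 -= lr * dW2; b2 -= lr * err.mean(axis=0)
            W1 -= lr * dW1; b1 -= lr * dh.mean(axis=0)

        accuracy = np.mean((out > 0.5) == (y > 0.5))
        print(f"memorized {accuracy:.0%} of {n_patterns} random patterns "
              f"with {n_hidden} hidden units")

    Sweeping n_hidden and n_patterns in such a toy setup is one simple way to explore, numerically, the kind of minimal-size question the paper addresses analytically.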

  14. [The present and future state of minimized extracorporeal circulation].

    PubMed

    Meng, Fan; Yang, Ming

    2013-05-01

    Minimized extracorporeal circulation is a newer form of extracorporeal circulation that reduces the postoperative side effects of conventional extracorporeal circulation. This paper introduces the principle, characteristics, applications, and related research of minimized extracorporeal circulation. To address the problems of systemic inflammatory response syndrome and limited assist time, the article proposes three directions for development: system miniaturization and integration, pulsatile blood pumps, and adaptive control based on the identification of patient parameters.

  15. [Ethical principles of clinical trials in minors].

    PubMed

    Koch, H J; Raschka, C

    2002-12-05

    Clinical trials in volunteers and patients are essential to ensure rational treatment of patients. As a rule, drugs are routinely developed for adults, but children are excluded. A major reason for this restriction are ethical justifications, in particular the lack of autonomy on the part of children. The principle of fairness, however, requires that everyone should benefit from progress. Industry, science and society are therefore called upon to find ways of making available safe and adequate treatment for children as quickly as possible, by defining the required conditions for pediatric clinical trials. Important principles are minimal risk, minimal invasivity, rapid decision-making, and careful documentation of trial results. Dynamic ethical principles, such as autonomy and competence in adolescents must be considered on equal footing with existing international GCP guidelines. Aspects of child psychology indicate that the autonomy of adolescents should be respected. Where economic incentives for such trials are absent, for example, in the case of non-pharmacological problems, pediatric trials must be considered a task for society as a whole.

  16. U.S. Geological Survey Fundamental Science Practices

    USGS Publications Warehouse

    ,

    2011-01-01

    The USGS has a long and proud tradition of objective, unbiased science in service to the Nation. A reputation for impartiality and excellence is one of our most important assets. To help preserve this vital asset, in 2004 the Executive Leadership Team (ELT) of the USGS was charged by the Director to develop a set of fundamental science practices, philosophical premises, and operational principles as the foundation for all USGS research and monitoring activities. In a concept document, 'Fundamental Science Practices of the U.S. Geological Survey', the ELT proposed 'a set of fundamental principles to underlie USGS science practices.' The document noted that protecting the reputation of USGS science for quality and objectivity requires the following key elements: - Clearly articulated, Bureau-wide fundamental science practices. - A shared understanding at all levels of the organization that the health and future of the USGS depend on following these practices. - The investment of budget, time, and people to ensure that the USGS reputation and high-quality standards are maintained. The USGS Fundamental Science Practices (FSP) encompass all elements of research investigations, including data collection, experimentation, analysis, writing results, peer review, management review, and Bureau approval and publication of information products. The focus of FSP is on how science is carried out and how products are produced and disseminated. FSP is not designed to address the question of what work the USGS should do; that is addressed in USGS science planning handbooks and other documents. Building from longstanding existing USGS policies and the ELT concept document, in May 2006, FSP policies were developed with input from all parts of the organization and were subsequently incorporated into the Bureau's Survey Manual. In developing an implementation plan for FSP policy, the intent was to recognize and incorporate the best of USGS current practices to obtain the optimum

  17. 17 CFR 38.850 - Core Principle 16.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... MARKETS Conflicts of Interest § 38.850 Core Principle 16. The board of trade shall establish and enforce rules: (a) To minimize conflicts of interest in the decision-making process of the contract market; and...

  18. 17 CFR 38.850 - Core Principle 16.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... MARKETS Conflicts of Interest § 38.850 Core Principle 16. The board of trade shall establish and enforce rules: (a) To minimize conflicts of interest in the decision-making process of the contract market; and...

  19. Basic principles of variable speed drives

    NASA Technical Reports Server (NTRS)

    Loewenthal, S. H.

    1973-01-01

    An understanding of the principles which govern variable speed drive operation is discussed for successful drive application. The fundamental factors of torque, speed ratio, and power as they relate to drive selection are discussed. The basic types of variable speed drives, their operating characteristics and their applications are also presented.

  20. Structural principles for computational and de novo design of 4Fe-4S metalloproteins

    PubMed Central

    Nanda, Vikas; Senn, Stefan; Pike, Douglas H.; Rodriguez-Granillo, Agustina; Hansen, Will; Khare, Sagar D.; Noy, Dror

    2017-01-01

    Iron-sulfur centers in metalloproteins can access multiple oxidation states over a broad range of potentials, allowing them to participate in a variety of electron transfer reactions and serving as catalysts for high-energy redox processes. The nitrogenase FeMoco cluster converts dinitrogen to ammonia in an eight-electron process. The 2(Fe4S4)-containing bacterial ferredoxin is an evolutionarily ancient metalloprotein fold and is thought to be a primordial progenitor of extant oxidoreductases. Controlling chemical transformations mediated by iron-sulfur centers, such as nitrogen fixation, hydrogen production, and the electron transfer reactions involved in photosynthesis, is of tremendous importance for sustainable chemistry and energy production initiatives. As such, there is significant interest in the design of iron-sulfur proteins as minimal models to gain fundamental understanding of complex natural systems and as lead-molecules for industrial and energy applications. Herein, we discuss salient structural characteristics of natural iron-sulfur proteins and how they guide principles for design. Model structures of past designs are analyzed in the context of these principles and potential directions for enhanced designs are presented, and new areas of iron-sulfur protein design are proposed. PMID:26449207

  1. Does quantity generate quality? Testing the fundamental principle of brainstorming.

    PubMed

    Muñoz Adánez, Alfredo

    2005-11-01

    The purpose of this work is to test the chief principle of brainstorming, formulated as "quantity generates quality." The study is included within a broad program whose goal is to detect the strong and weak points of creative techniques. In a sample of 69 groups, containing between 3 and 8 members, the concurrence of two commonly accepted criteria was established as a quality rule: originality and utility or value. The results fully support the quantity-quality relation (r = .893): the more ideas produced to solve a problem, the better quality of the ideas. The importance of this finding, which supports Osborn's theory, is discussed, and the use of brainstorming is recommended to solve the many open problems faced by our society.

  2. Precautionary Principles: General Definitions and Specific Applications to Genetically Modified Organisms

    ERIC Educational Resources Information Center

    Lofstedt, Ragnar E.; Fischhoff, Baruch; Fischhoff, Ilya R.

    2002-01-01

    Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find…

  3. Optimality Principles for Model-Based Prediction of Human Gait

    PubMed Central

    Ackermann, Marko; van den Bogert, Antonie J.

    2010-01-01

    Although humans have a large repertoire of potential movements, gait patterns tend to be stereotypical and appear to be selected according to optimality principles such as minimal energy. When applied to dynamic musculoskeletal models such optimality principles might be used to predict how a patient’s gait adapts to mechanical interventions such as prosthetic devices or surgery. In this paper we study the effects of different performance criteria on predicted gait patterns using a 2D musculoskeletal model. The associated optimal control problem for a family of different cost functions was solved utilizing the direct collocation method. It was found that fatigue-like cost functions produced realistic gait, with stance phase knee flexion, as opposed to energy-related cost functions which avoided knee flexion during the stance phase. We conclude that fatigue minimization may be one of the primary optimality principles governing human gait. PMID:20074736

  4. Evidence-based prosthodontics: fundamental considerations, limitations, and guidelines.

    PubMed

    Bidra, Avinash S

    2014-01-01

    Evidence-based dentistry is rapidly emerging to become an integral part of patient care, dental education, and research. Prosthodontics is a unique dental specialty that encompasses art, philosophy, and science and includes reversible and irreversible treatments. It not only affords good applicability of many principles of evidence-based dentistry but also poses numerous limitations. This article describes the epidemiologic background, fundamental considerations, scrutiny of levels of evidence, limitations, guidelines, and future perspectives of evidence-based prosthodontics. Understanding these principles can aid clinicians in appropriate appraisal of the prosthodontics literature and use the best available evidence for making confident clinical decisions and optimizing patient care. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. CE-TOF/MS: fundamental concepts, instrumental considerations and applications.

    PubMed

    Staub, Aline; Schappler, Julie; Rudaz, Serge; Veuthey, Jean-Luc

    2009-05-01

    This review discusses the fundamental principles of TOF analyzers and covers the great progress that has been made in this area in recent years (i.e. orthogonal acceleration, reflectron). This paper also gives an overview of applications performed by CE coupled to TOF/MS detection. The main domains of interest include the analysis of biomolecules and natural compounds.
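
    The core relation behind any TOF analyzer, linking flight time to mass-to-charge ratio after acceleration through a potential V over a field-free drift length L, can be written as follows (a reminder of the basic principle, independent of the instrumental refinements such as orthogonal acceleration and reflectrons discussed in the review):

        % Kinetic energy gained in the source: z e V = \tfrac{1}{2} m v^{2}
        t \;=\; \frac{L}{v} \;=\; L\sqrt{\frac{m}{2\,z\,e\,V}},
        \qquad
        \frac{m}{z} \;=\; \frac{2\,e\,V\,t^{2}}{L^{2}} .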

  6. Fundamental Principles of Network Formation among Preschool Children1

    PubMed Central

    Schaefer, David R.; Light, John M.; Fabes, Richard A.; Hanish, Laura D.; Martin, Carol Lynn

    2009-01-01

    The goal of this research was to investigate the origins of social networks by examining the formation of children’s peer relationships in 11 preschool classes throughout the school year. We investigated whether several fundamental processes of relationship formation were evident at this age, including reciprocity, popularity, and triadic closure effects. We expected these mechanisms to change in importance over time as the network crystallizes, allowing more complex structures to evolve from simpler ones in a process we refer to as structural cascading. We analyzed intensive longitudinal observational data of children’s interactions using the SIENA actor-based model. We found evidence that reciprocity, popularity, and triadic closure all shaped the formation of preschool children’s networks. The influence of reciprocity remained consistent, whereas popularity and triadic closure became increasingly important over the course of the school year. Interactions between age and endogenous network effects were nonsignificant, suggesting that these network formation processes were not moderated by age in this sample of young children. We discuss the implications of our longitudinal network approach and findings for the study of early network developmental processes. PMID:20161606

  7. Single-Atom Demonstration of the Quantum Landauer Principle

    NASA Astrophysics Data System (ADS)

    Yan, L. L.; Xiong, T. P.; Rehan, K.; Zhou, F.; Liang, D. F.; Chen, L.; Zhang, J. Q.; Yang, W. L.; Ma, Z. H.; Feng, M.

    2018-05-01

    One of the outstanding challenges to information processing is the eloquent suppression of energy consumption in the execution of logic operations. The Landauer principle sets an energy constraint in deletion of a classical bit of information. Although some attempts have been made to experimentally approach the fundamental limit restricted by this principle, exploring the Landauer principle in a purely quantum mechanical fashion is still an open question. Employing a trapped ultracold ion, we experimentally demonstrate a quantum version of the Landauer principle, i.e., an equality associated with the energy cost of information erasure in conjunction with the entropy change of the associated quantized environment. Our experimental investigation substantiates an intimate link between information thermodynamics and quantum candidate systems for information processing.
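
    The classical Landauer bound referenced above, together with its numerical value at room temperature, is reproduced below for orientation (k_B is Boltzmann's constant; the exact quantum equality verified in the experiment, which also involves the entropy change of the quantized environment, is not restated here):

        % Minimal heat dissipated when erasing one bit of information at temperature T:
        \langle Q \rangle \;\ge\; k_{B}\,T\,\ln 2
        \;\approx\; 2.9\times 10^{-21}\ \text{J at } T = 300\ \text{K}.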

  8. Religious Fundamentalism among Young Muslims in Egypt and Saudi Arabia

    ERIC Educational Resources Information Center

    Moaddel, Mansoor; Karabenick, Stuart A.

    2008-01-01

    Religious fundamentalism is conceived as a distinctive set of beliefs and attitudes toward one's religion, including obedience to religious norms, belief in the universality and immutability of its principles, the validity of its claims, and its indispensability for human happiness. Surveys of Egyptian and Saudi youth, ages 18-25, reveal that…

  9. Randomness Amplification under Minimal Fundamental Assumptions on the Devices

    NASA Astrophysics Data System (ADS)

    Ramanathan, Ravishankar; Brandão, Fernando G. S. L.; Horodecki, Karol; Horodecki, Michał; Horodecki, Paweł; Wojewódka, Hanna

    2016-12-01

    Recently, the physically realistic protocol amplifying the randomness of Santha-Vazirani sources producing cryptographically secure random bits was proposed; however, for reasons of practical relevance, the crucial question remained open regarding whether this can be accomplished under the minimal conditions necessary for the task. Namely, is it possible to achieve randomness amplification using only two no-signaling components and in a situation where the violation of a Bell inequality only guarantees that some outcomes of the device for specific inputs exhibit randomness? Here, we solve this question and present a device-independent protocol for randomness amplification of Santha-Vazirani sources using a device consisting of two nonsignaling components. We show that the protocol can amplify any such source that is not fully deterministic into a fully random source while tolerating a constant noise rate and prove the composable security of the protocol against general no-signaling adversaries. Our main innovation is the proof that even the partial randomness certified by the two-party Bell test [a single input-output pair (u* , x* ) for which the conditional probability P (x*|u*) is bounded away from 1 for all no-signaling strategies that optimally violate the Bell inequality] can be used for amplification. We introduce the methodology of a partial tomographic procedure on the empirical statistics obtained in the Bell test that ensures that the outputs constitute a linear min-entropy source of randomness. As a technical novelty that may be of independent interest, we prove that the Santha-Vazirani source satisfies an exponential concentration property given by a recently discovered generalized Chernoff bound.
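
    Two textbook definitions help unpack the abstract (they are standard background, not results of the paper): an \varepsilon-Santha-Vazirani source emits bits whose conditional bias is bounded, and min-entropy quantifies the randomness of the protocol output,

        \Bigl| P(x_i = 1 \mid x_1,\dots,x_{i-1}, e) - \tfrac{1}{2} \Bigr| \le \varepsilon \quad (0 \le \varepsilon < \tfrac{1}{2}), \qquad H_{\min}(X) = -\log_2 \max_x P(X = x),

    so "not fully deterministic" corresponds to \varepsilon < 1/2, and a linear min-entropy source of n bits has H_{\min}(X) = \Omega(n).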

  10. Prospects and fundamental limitations of room temperature, non-avalanche, semiconductor photon-counting sensors (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Ma, Jiaju; Zhang, Yang; Wang, Xiaoxin; Ying, Lei; Masoodian, Saleh; Wang, Zhiyuan; Starkey, Dakota A.; Deng, Wei; Kumar, Rahul; Wu, Yang; Ghetmiri, Seyed Amir; Yu, Zongfu; Yu, Shui-Qing; Salamo, Gregory J.; Fossum, Eric R.; Liu, Jifeng

    2017-05-01

    This research investigates the fundamental limits and trade-space of quantum semiconductor photodetectors using the Schrödinger equation and the laws of thermodynamics. We envision that, to optimize the metrics of single photon detection, it is critical to maximize the optical absorption in the minimal volume and minimize the carrier transit process simultaneously. Integration of photon management with quantum charge transport/redistribution upon optical excitation can be engineered to maximize the quantum efficiency (QE) and data rate and minimize timing jitter at the same time. Due to the ultra-low capacitance of these quantum devices, even a single photoelectron transfer can induce a notable change in the voltage, enabling non-avalanche single photon detection at room temperature as has been recently demonstrated in Si quanta image sensors (QIS). In this research, uniform III-V quantum dots (QDs) and Si QIS are used as model systems to test the theory experimentally. Based on the fundamental understanding, we also propose proof-of-concept, photon-managed quantum capacitance photodetectors. Built upon the concepts of QIS and single electron transistor (SET), this novel device structure provides a model system to synergistically test the fundamental limits and trade-space predicted by the theory for semiconductor detectors. This project is sponsored under DARPA/ARO's DETECT Program: Fundamental Limits of Quantum Semiconductor Photodetectors.
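
    A back-of-the-envelope estimate (with an illustrative capacitance chosen by us, not quoted from the paper) shows why ultra-low capacitance makes a single photoelectron visible:

        \Delta V = \frac{q}{C} \approx \frac{1.6\times 10^{-19}\,\mathrm{C}}{0.2\times 10^{-15}\,\mathrm{F}} \approx 0.8\,\mathrm{mV},

    a voltage step large enough to be resolved by a low-noise readout chain without avalanche gain.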

  11. Fundamental Aspects of Single Molecule and Zeptomole Electroanalysis

    DTIC Science & Technology

    2018-04-01

    [Abstract excerpts] The objective of our research program was to provide the fundamental understanding required for using the principles of electroanalytical chemistry to detect... The report is organized in terms of research in the individual co-PI laboratories. [Figure 1 caption] A probe DNA sequence (red) immobilized onto a nanoscale... [Further excerpt] ...were tested on both Au microelectrodes, an Au microband in a microfluidic device, and an Au microband in a microfluidic device in the presence of a...

  12. Development of Canonical Transformations from Hamilton's Principle.

    ERIC Educational Resources Information Center

    Quade, C. Richard

    1979-01-01

    The theory of canonical transformations and its development are discussed with regard to its application to Hamilton's principle. Included are the derivation of the equations of motion and a lack of symmetry in the formulation with respect to the Lagrangian and the fundamental commutator relations of quantum mechanics. (Author/SA)
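
    For reference (standard results, not specific to this article), Hamilton's principle and the fundamental commutator mentioned above read

        \delta \int_{t_1}^{t_2} L(q, \dot{q}, t)\, dt = 0, \qquad [\hat{q}, \hat{p}] = i\hbar .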

  13. Common Principles and Multiculturalism

    PubMed Central

    Zahedi, Farzaneh; Larijani, Bagher

    2009-01-01

    Judgment on the rightness and wrongness of beliefs and behaviors is a main issue in bioethics. Over the centuries, great philosophers and ethicists have discussed the suitable tools for determining which acts are morally sound and which are not. The emergence of contemporary bioethics in the West has resulted in a misconception that absolute Westernized principles would be appropriate tools for ethical decision making in different cultures. We discuss this issue by introducing a clinical case. Considering the various cultural beliefs around the world, though it is not logical to consider all of them ethically acceptable, we can agree on some general fundamental principles instead of going to the extremes of relativism and absolutism. Islamic teachings, according to the evidence presented in this paper, are consistent with this idea. PMID:23908720

  14. Common principles and multiculturalism.

    PubMed

    Zahedi, Farzaneh; Larijani, Bagher

    2009-01-01

    Judgment on the rightness and wrongness of beliefs and behaviors is a main issue in bioethics. Over the centuries, great philosophers and ethicists have discussed the suitable tools for determining which acts are morally sound and which are not. The emergence of contemporary bioethics in the West has resulted in a misconception that absolute Westernized principles would be appropriate tools for ethical decision making in different cultures. We discuss this issue by introducing a clinical case. Considering the various cultural beliefs around the world, though it is not logical to consider all of them ethically acceptable, we can agree on some general fundamental principles instead of going to the extremes of relativism and absolutism. Islamic teachings, according to the evidence presented in this paper, are consistent with this idea.

  15. Lessons from the GP-B Experience for Future Fundamental Physics Missions in Space

    NASA Technical Reports Server (NTRS)

    Kolodziejczak, Jeffery

    2006-01-01

    Gravity Probe B launched in April 2004 and completed its science data collection in September 2005, with the objective of sub-milliarcsec measurement of two General Relativistic effects on the spin axis orientation of orbiting gyroscopes. Much of the technology required by GP-B has potential application in future missions intended to make precision measurements. The philosophical approach and experiment design principles developed for GP-B are equally adaptable to these mission concepts. This talk will discuss GP-B's experimental approach and the technological and philosophical lessons learned that apply to future experiments in fundamental physics. Measurement of fundamental constants to high precision, probes of short-range forces, searches for equivalence principle violations, and detection of gravitational waves are examples of concepts and missions that will benefit from GP-B's experience.

  16. The Particle Adventure | What is fundamental? | Fundamental

    Science.gov Websites

    The Particle Adventure site addresses what is fundamental in particle physics. Topics include: the atom; Is the atom fundamental?; Is the nucleus fundamental?; Are protons and neutrons fundamental?; decay; What is the mechanism giving mass to fundamental particles? (Parts 1 and 2); How does the Higgs boson get its mass?; and Finding the mass of the Higgs boson.

  17. Ground-state densities from the Rayleigh-Ritz variation principle and from density-functional theory.

    PubMed

    Kvaal, Simen; Helgaker, Trygve

    2015-11-14

    The relationship between the densities of ground-state wave functions (i.e., the minimizers of the Rayleigh-Ritz variation principle) and the ground-state densities in density-functional theory (i.e., the minimizers of the Hohenberg-Kohn variation principle) is studied within the framework of convex conjugation, in a generic setting covering molecular systems, solid-state systems, and more. Having introduced admissible density functionals as functionals that produce the exact ground-state energy for a given external potential by minimizing over densities in the Hohenberg-Kohn variation principle, necessary and sufficient conditions on such functionals are established to ensure that the Rayleigh-Ritz ground-state densities and the Hohenberg-Kohn ground-state densities are identical. We apply the results to molecular systems in the Born-Oppenheimer approximation. For any given potential v ∈ L(3/2)(ℝ(3)) + L(∞)(ℝ(3)), we establish a one-to-one correspondence between the mixed ground-state densities of the Rayleigh-Ritz variation principle and the mixed ground-state densities of the Hohenberg-Kohn variation principle when the Lieb density-matrix constrained-search universal density functional is taken as the admissible functional. A similar one-to-one correspondence is established between the pure ground-state densities of the Rayleigh-Ritz variation principle and the pure ground-state densities obtained using the Hohenberg-Kohn variation principle with the Levy-Lieb pure-state constrained-search functional. In other words, all physical ground-state densities (pure or mixed) are recovered with these functionals and no false densities (i.e., minimizing densities that are not physical) exist. The importance of topology (i.e., choice of Banach space of densities and potentials) is emphasized and illustrated. The relevance of these results for current-density-functional theory is examined.
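
    In schematic form (suppressing the functional-analytic details the paper treats carefully), the two variation principles being compared are

        E(v) = \inf_{\psi} \langle \psi | T + W + \textstyle\sum_i v(\mathbf{r}_i) | \psi \rangle \qquad \text{and} \qquad E(v) = \inf_{\rho} \Bigl\{ F(\rho) + \int v(\mathbf{r})\, \rho(\mathbf{r})\, d\mathbf{r} \Bigr\},

    where F is an admissible universal functional such as the Lieb constrained-search functional F(\rho) = \inf_{\Gamma \mapsto \rho} \mathrm{Tr}[\Gamma (T + W)]; the question addressed is when the two infima are attained by the same densities.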

  18. Values and principles evident in current health promotion practice.

    PubMed

    Gregg, Jane; O'Hara, Lily

    2007-04-01

    Modern health promotion practice needs to respond to complex health issues that have multiple interrelated determinants. This requires an understanding of the values and principles of health promotion. A literature review was undertaken to explore the values and principles evident in current health promotion theory and practice. A broad range of values and principles are espoused as being integral to modern health promotion theory and practice. Although there are some commonalities across these lists, there is no recognised, authoritative set of values and principles accepted as fundamental and applicable to modern health promotion. There is a continuum of values and principles evident in health promotion practice from those associated with holistic, ecological, salutogenic health promotion to those more in keeping with conventional health promotion. There is a need for a system of values and principles consistent with modern health promotion that enables practitioners to purposefully integrate these values and principles into their understanding of health, as well as their needs assessment, planning, implementation and evaluation practice.

  19. Solar-System Bodies as Teaching Tools in Fundamental Physics

    NASA Astrophysics Data System (ADS)

    Genus, Amelia; Overduin, James

    2018-01-01

    We show how asteroids can be used as teaching tools in fundamental physics. Current gravitational theory assumes that all bodies fall with the same acceleration in the same gravitational field. But this assumption, known as the Equivalence Principle, is violated to some degree in nearly all theories that attempt to unify gravitation with the other fundamental forces of nature. In such theories, bodies with different compositions can fall at different rates, producing small non-Keplerian distortions in their orbits. We focus on the unique all-metal asteroid 16 Psyche as a test case. Using Kepler’s laws of planetary motion together with recent observational data on the orbital motions of Psyche and its neighbors, students are able to derive new constraints on current theories in fundamental physics. These constraints take on particular interest since NASA has just announced plans to visit Psyche in 2026.
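
    The classroom calculation rests on Kepler's third law together with a simple composition-dependent parametrization of an equivalence-principle violation (the parametrization below is a common convention, not necessarily the authors' exact choice):

        T^2 = \frac{4\pi^2}{G M_\odot}\, a^3, \qquad a_{\mathrm{fall}} = (1 + \eta)\, \frac{G M_\odot}{r^2},

    so a nonzero, composition-dependent \eta would appear as a small systematic deviation of Psyche's orbital period from the Keplerian value implied by its semimajor axis.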

  20. Fundamentals of Structural Geology

    NASA Astrophysics Data System (ADS)

    Pollard, David D.; Fletcher, Raymond C.

    2005-09-01

    Fundamentals of Structural Geology provides a new framework for the investigation of geological structures by integrating field mapping and mechanical analysis. Assuming a basic knowledge of physical geology, introductory calculus and physics, it emphasizes the observational data, modern mapping technology, principles of continuum mechanics, and the mathematical and computational skills necessary to quantitatively map, describe, model, and explain deformation in Earth's lithosphere. By starting from the fundamental conservation laws of mass and momentum, the constitutive laws of material behavior, and the kinematic relationships for strain and rate of deformation, the authors demonstrate the relevance of solid and fluid mechanics to structural geology. This book offers a modern quantitative approach to structural geology for advanced students and researchers in structural geology and tectonics. It is supported by a website hosting images from the book, additional colour images, student exercises and MATLAB scripts; solutions to the exercises are available to instructors. The book integrates field mapping using modern technology with the analysis of structures based on a complete mechanics. MATLAB is used to visualize physical fields and analytical results, and MATLAB scripts can be downloaded from the website to recreate textbook graphics and enable students to explore their choice of parameters and boundary conditions. The supplementary website hosts color images of outcrop photographs used in the text, supplementary color images, and images of textbook figures for classroom presentations. The textbook website also includes student exercises designed to instill the fundamental relationships and to encourage the visualization of the evolution of geological structures; solutions are available to instructors.
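
    The conservation laws the book starts from take the familiar continuum-mechanics form (standard equations, restated here only as a reminder):

        \frac{\partial \rho}{\partial t} + \nabla \cdot (\rho \mathbf{v}) = 0, \qquad \rho\, \frac{D\mathbf{v}}{Dt} = \nabla \cdot \boldsymbol{\sigma} + \rho\, \mathbf{g},

    i.e., conservation of mass and of linear momentum (Cauchy's equation); a constitutive law for the stress \boldsymbol{\sigma} then closes the system used to model deformation.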

  1. A minimization principle for the description of modes associated with finite-time instabilities

    PubMed Central

    Babaee, H.

    2016-01-01

    We introduce a minimization formulation for the determination of a finite-dimensional, time-dependent, orthonormal basis that captures directions of the phase space associated with transient instabilities. While these instabilities have finite lifetime, they can play a crucial role either by altering the system dynamics through the activation of other instabilities or by creating sudden nonlinear energy transfers that lead to extreme responses. However, their essentially transient character makes their description a particularly challenging task. We develop a minimization framework that focuses on the optimal approximation of the system dynamics in the neighbourhood of the system state. This minimization formulation results in differential equations that evolve a time-dependent basis so that it optimally approximates the most unstable directions. We demonstrate the capability of the method for two families of problems: (i) linear systems, including the advection–diffusion operator in a strongly non-normal regime as well as the Orr–Sommerfeld/Squire operator, and (ii) nonlinear problems, including a low-dimensional system with transient instabilities and the vertical jet in cross-flow. We demonstrate that the time-dependent subspace captures the strongly transient non-normal energy growth (in the short-time regime), while for longer times the modes capture the expected asymptotic behaviour. PMID:27118900
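
    A minimal sketch of how such a time-dependent orthonormal basis can be evolved, assuming a linearized operator L(t) about the current trajectory and a particular gauge choice (the paper's precise formulation may differ in details):

        \dot{\mathbf{u}}_i = L(t)\, \mathbf{u}_i - \sum_{j=1}^{r} \langle L(t)\, \mathbf{u}_i, \mathbf{u}_j \rangle\, \mathbf{u}_j, \qquad i = 1, \dots, r,

    which keeps \langle \mathbf{u}_i, \mathbf{u}_j \rangle = \delta_{ij} exactly (in a real inner-product space) while the r-dimensional subspace follows the most strongly growing directions of the linearized dynamics.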

  2. Water Balance Covers For Waste Containment: Principles and Practice

    EPA Science Inventory

    Water Balance Covers for Waste Containment: Principles and Practices introduces water balance covers and compares them with conventional approaches to waste containment. The authors provide detailed analysis of the fundamentals of soil physics and design issues, introduce appl...

  3. Cortical Composition Hierarchy Driven by Spine Proportion Economical Maximization or Wire Volume Minimization

    PubMed Central

    Karbowski, Jan

    2015-01-01

    The structure and quantitative composition of the cerebral cortex are interrelated with its computational capacity. Empirical data analyzed here indicate a certain hierarchy in local cortical composition. Specifically, neural wire, i.e., axons and dendrites, each take about 1/3 of cortical space, spines and glia/astrocytes each occupy about (1/3)^2, and capillaries around (1/3)^4. Moreover, data analysis across species reveals that these fractions are roughly brain size independent, which suggests that they could be in some sense optimal and thus important for brain function. Is there any principle that sets them in this invariant way? This study first builds a model of a local circuit in which neural wire, spines, astrocytes, and capillaries are mutually coupled elements and are treated within a single mathematical framework. Next, various forms of wire minimization rule (wire length, surface area, volume, or conduction delays) are analyzed, of which only minimization of wire volume provides realistic results that are very close to the empirical cortical fractions. As an alternative, a new principle called “spine economy maximization” is proposed and investigated, which is associated with maximization of spine proportion in the cortex per spine size and yields equally good but more robust results. Additionally, a combination of wire cost and spine economy notions is considered as a meta-principle, and it is found that this proposition gives only marginally better results than either pure wire volume minimization or pure spine economy maximization, but only if the spine economy component dominates. However, such a combined meta-principle yields much better results than the constraints related solely to minimization of wire length, wire surface area, and conduction delays. Interestingly, the type of spine size distribution also plays a role, and better agreement with the data is achieved for distributions with long tails. In sum, these results suggest that for the
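
    Spelling out the fractions quoted at the start of the abstract (simple arithmetic on the reported numbers, added for orientation):

        \tfrac{1}{3} \approx 0.33 \ (\text{axons; likewise dendrites}), \qquad \bigl(\tfrac{1}{3}\bigr)^2 \approx 0.11 \ (\text{spines; likewise glia/astrocytes}), \qquad \bigl(\tfrac{1}{3}\bigr)^4 \approx 0.012 \ (\text{capillaries}),

    so on this reading the listed components account for roughly 2(1/3) + 2(1/9) + 1/81, about 0.90 of local cortical volume, leaving on the order of 10% for the remaining constituents.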

  4. Minimizing Accidents and Risks in High Adventure Outdoor Pursuits.

    ERIC Educational Resources Information Center

    Meier, Joel

    The fundamental dilemma in adventure programming is eliminating unreasonable risks to participants without also reducing levels of excitement, challenge, and stress. Most accidents are caused by a combination of unsafe conditions, unsafe acts, and error judgments. The best and only way to minimize critical human error in adventure programs is…

  5. Minimally invasive technology in the management of breast disease.

    PubMed

    Hung, W K; Ying, M; Chan, C M; Lam, H S; Mak, K L

    2009-01-01

    Minimally invasive surgery is gaining popularity around the world because it achieves the same or even superior results when compared to standard surgery but with less morbidity. Minimally invasive breast surgery is a broad concept encompassing new developments in the field of breast surgery that work on this minimally invasive principle. In this regard, breast-conserving surgery and sentinel lymph node biopsy are good illustrations of this concept. There are three major areas of progress in the minimally invasive management of breast disease. First, percutaneous excisional devices are now available that can replace the surgical excision of breast mass lesions. Second, various ablative treatments are capable of destroying breast cancers in situ instead of surgical excision. Third, mammary ductoscopy provides a new approach to the investigation of mammary duct pathology. Clinical experience and potential applications of these new technologies are reviewed.

  6. Principles of Guided Missiles and Nuclear Weapons.

    ERIC Educational Resources Information Center

    Naval Personnel Program Support Activity, Washington, DC.

    Fundamentals of missile and nuclear weapons systems are presented in this book which is primarily prepared as the second text of a three-volume series for students of the Navy Reserve Officers' Training Corps and the Officer Candidate School. Following an introduction to guided missiles and nuclear physics, basic principles and theories are…

  7. Quantum speed limits: from Heisenberg’s uncertainty principle to optimal quantum control

    NASA Astrophysics Data System (ADS)

    Deffner, Sebastian; Campbell, Steve

    2017-11-01

    One of the most widely known building blocks of modern physics is Heisenberg’s indeterminacy principle. Among the different statements of this fundamental property of the full quantum mechanical nature of physical reality, the uncertainty relation for energy and time has a special place. Its interpretation and its consequences have inspired continued research efforts for almost a century. In its modern formulation, the uncertainty relation is understood as setting a fundamental bound on how fast any quantum system can evolve. In this topical review we describe important milestones, such as the Mandelstam-Tamm and the Margolus-Levitin bounds on the quantum speed limit, and summarise recent applications in a variety of current research fields—including quantum information theory, quantum computing, and quantum thermodynamics amongst several others. To bring order and to provide an access point into the many different notions and concepts, we have grouped the various approaches into the minimal time approach and the geometric approach, where the former relies on quantum control theory, and the latter arises from measuring the distinguishability of quantum states. Due to the volume of the literature, this topical review can only present a snapshot of the current state-of-the-art and can never be fully comprehensive. Therefore, we highlight but a few works hoping that our selection can serve as a representative starting point for the interested reader.
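
    For completeness, the two bounds named above take the following standard form for evolution between orthogonal states of a system with energy spread \Delta E and mean energy \langle E \rangle above the ground state:

        \tau \ge \tau_{\mathrm{MT}} = \frac{\pi \hbar}{2 \Delta E}, \qquad \tau \ge \tau_{\mathrm{ML}} = \frac{\pi \hbar}{2 \langle E \rangle}, \qquad \tau_{\mathrm{QSL}} = \max\{\tau_{\mathrm{MT}}, \tau_{\mathrm{ML}}\}.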

  8. Detection principles of biological and chemical FET sensors.

    PubMed

    Kaisti, Matti

    2017-12-15

    The seminal importance of detecting ions and molecules for point-of-care tests has driven the search for more sensitive, specific, and robust sensors. Electronic detection holds promise for future miniaturized in-situ applications and can be integrated into existing electronic manufacturing processes and technology. The resulting small devices will be inherently well suited for multiplexed and parallel detection. In this review, different field-effect transistor (FET) structures and detection principles are discussed, including label-free and indirect detection mechanisms. The fundamental detection principle governing every potentiometric sensor is introduced, and different state-of-the-art FET sensor structures are reviewed. This is followed by an analysis of electrolyte interfaces and their influence on sensor operation. Finally, the fundamentals of different detection mechanisms are reviewed and some detection schemes are discussed. In the conclusion, current commercial efforts are briefly considered. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
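
    Because the fundamental detection principle is potentiometric, the classical Nernstian sensitivity limit (a textbook figure, quoted here as a reference point rather than from the review) is

        \Delta V = \frac{2.303\, R T}{z F} \log_{10}\frac{a_2}{a_1} \approx \frac{59.2}{z}\ \mathrm{mV\ per\ decade\ of\ ion\ activity} \qquad (T = 298\,\mathrm{K}),

    the ceiling against which label-free FET sensors for monovalent ions are commonly benchmarked.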

  9. Fundamental gaps with approximate density functionals: The derivative discontinuity revealed from ensemble considerations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kraisler, Eli; Kronik, Leeor

    2014-05-14

    The fundamental gap is a central quantity in the electronic structure of matter. Unfortunately, the fundamental gap is not generally equal to the Kohn-Sham gap of density functional theory (DFT), even in principle. The two gaps differ precisely by the derivative discontinuity, namely, an abrupt change in slope of the exchange-correlation energy as a function of electron number, expected across an integer-electron point. Popular approximate functionals are thought to be devoid of a derivative discontinuity, strongly compromising their performance for prediction of spectroscopic properties. Here we show that, in fact, all exchange-correlation functionals possess a derivative discontinuity, which arises naturally from the application of ensemble considerations within DFT, without any empiricism. This derivative discontinuity can be expressed in closed form using only quantities obtained in the course of a standard DFT calculation of the neutral system. For small, finite systems, addition of this derivative discontinuity indeed results in a greatly improved prediction for the fundamental gap, even when based on the most simple approximate exchange-correlation density functional – the local density approximation (LDA). For solids, the same scheme is exact in principle, but when applied to LDA it results in a vanishing derivative discontinuity correction. This failure is shown to be directly related to the failure of LDA in predicting fundamental gaps from total energy differences in extended systems.
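
    The quantities discussed above are related by the standard definitions

        E_{\mathrm{gap}} = I - A = \bigl[E(N-1) - E(N)\bigr] - \bigl[E(N) - E(N+1)\bigr] = \varepsilon_{\mathrm{gap}}^{\mathrm{KS}} + \Delta_{\mathrm{xc}},

    where \Delta_{\mathrm{xc}} is the derivative discontinuity that the ensemble treatment expresses in closed form from a single DFT calculation of the neutral system.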

  10. Design and Establishment of Quality Model of Fundamental Geographic Information Database

    NASA Astrophysics Data System (ADS)

    Ma, W.; Zhang, J.; Zhao, Y.; Zhang, P.; Dang, Y.; Zhao, T.

    2018-04-01

    In order to make the quality evaluation of the Fundamental Geographic Information Databases (FGIDB) more comprehensive, objective and accurate, this paper studies and establishes a quality model of the FGIDB, formed by the standardization of database construction and quality control, the conformity of data set quality, and the functionality of the database management system. It also designs the overall principles, contents and methods of quality evaluation for the FGIDB, providing a basis and reference for carrying out quality control and quality evaluation of the FGIDB. The paper then develops the quality elements, evaluation items and properties of the Fundamental Geographic Information Database step by step on the basis of the quality model framework. Connected organically, these quality elements and evaluation items constitute the quality model of the Fundamental Geographic Information Database. This model is the foundation for stipulating quality requirements and for quality evaluation of the Fundamental Geographic Information Database, and is of great significance for quality assurance in the design and development stage, for requirement formulation in the testing and evaluation stage, and for building a standard system for quality evaluation technology of the Fundamental Geographic Information Database.

  11. Personality Theories Facilitate Integrating the Five Principles and Deducing Hypotheses for Testing

    ERIC Educational Resources Information Center

    Maddi, Salvatore R.

    2007-01-01

    Comments on the original article "A New Big Five: Fundamental Principles for an Integrative Science of Personality," by Dan P. McAdams and Jennifer L. Pals (see record 2006-03947-002). In presenting their view of personality science, McAdams and Pals (April 2006) elaborated the importance of five principles for building an integrated science of…

  12. Devising Principles of Design for Numeracy Tasks

    ERIC Educational Resources Information Center

    Geiger, Vince; Forgasz, Helen; Goos, Merrilyn; Bennison, Anne

    2014-01-01

    Numeracy is a fundamental component of the Australian National Curriculum as a General Capability identified in each F-10 subject. In this paper, we consider the principles of design necessary for the development of numeracy tasks specific to subjects other than mathematics--in this case, the subject of English. We explore the nature of potential…

  13. [Evidence-based medicine as a fundamental principle of health care management for workers].

    PubMed

    Amirov, N Kh; Fatkhutdinova, L M

    2011-01-01

    Evidence-based principles in occupational medicine should include prevention, diagnosis, treatment and rehabilitation. A specific feature of occupational medicine is the necessity to prove cause-effect relationships between an occupational factor and the disease that emerges. An important place is occupied by cohort and intervention studies, systematic reviews and meta-analyses. Information obtained by the scientific community should be presented to practising specialists and put into everyday activities.

  14. Update on SU(2) gauge theory with NF = 2 fundamental flavours.

    NASA Astrophysics Data System (ADS)

    Drach, Vincent; Janowski, Tadeusz; Pica, Claudio

    2018-03-01

    We present a non perturbative study of SU(2) gauge theory with two fundamental Dirac flavours. This theory provides a minimal template which is ideal for a wide class of Standard Model extensions featuring novel strong dynamics, such as a minimal realization of composite Higgs models. We present an update on the status of the meson spectrum and decay constants based on increased statistics on our existing ensembles and the inclusion of new ensembles with lighter pion masses, resulting in a more reliable chiral extrapolation. Preprint: CP3-Origins-2017-048 DNRF90

  15. Supercritical fluid extraction. Principles and practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McHugh, M.A.; Krukonis, V.J.

    This book is a presentation of the fundamentals and application of super-critical fluid solvents (SCF). The authors cover virtually every facet of SCF technology: the history of SCF extraction, its underlying thermodynamic principles, process principles, industrial applications, and analysis of SCF research and development efforts. The thermodynamic principles governing SCF extraction are covered in depth. The often complex three-dimensional pressure-temperature composition (PTx) phase diagrams for SCF-solute mixtures are constructed in a coherent step-by-step manner using the more familiar two-dimensional Px diagrams. The experimental techniques used to obtain high pressure phase behavior information are described in detail and the advantages and disadvantages of each technique are explained. Finally, the equations used to model SCF-solute mixtures are developed, and modeling results are presented to highlight the correlational strengths of a cubic equation of state.
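
    As a representative example of the cubic equations of state used for such modeling (the Peng-Robinson form is a common choice; this record does not specify which equations the book employs):

        P = \frac{R T}{V_m - b} - \frac{a\, \alpha(T)}{V_m^2 + 2 b V_m - b^2},

    with a and b obtained from critical properties, \alpha(T) an acentric-factor-dependent temperature correction, and mixing rules extending the equation to SCF-solute mixtures.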

  16. Richard Day Deslattes, 21 Sept 1931 - 16 May 2001: Calibration of light, matter and fundamental constants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chantler, C.T.

    2003-01-24

    Richard Deslattes passed away on 16 May 2001 after a life dedicated to fundamental metrology. Although the themes of calibrating light, matter and fundamental constants can give three guiding principles through his career, the wide-ranging nature of his areas of interest is encompassed by over 165 refereed publications, with several cited over 100 times. He has left an enduring legacy to science.

  17. Insurance principles and the design of prospective payment systems.

    PubMed

    Ellis, R P; McGuire, T G

    1988-09-01

    This paper applies insurance principles to the issues of optimal outlier payments and designation of peer groups in Medicare's case-based prospective payment system for hospital care. Arrow's principle that full insurance after a deductible is optimal implies that, to minimize hospital risk, outlier payments should be based on hospital average loss per case rather than, as at present, based on individual case-level losses. The principle of experience rating implies defining more homogenous peer groups for the purpose of figuring average cost. The empirical significance of these results is examined using a sample of 470,568 discharges from 469 hospitals.
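
    Arrow's "full insurance after a deductible" result, applied to outlier payments, amounts to a reimbursement schedule of the form (a schematic statement of the principle, not the paper's exact payment formula)

        R(L) = \max\{0,\ L - D\},

    i.e., losses L, here measured as hospital average loss per case, are reimbursed in full beyond a deductible D, which minimizes the risk borne by the hospital for a given expected payout.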

  18. Autonomous intelligent cars: proof that the EPSRC Principles are future-proof

    NASA Astrophysics Data System (ADS)

    de Cock Buning, Madeleine; de Bruin, Roeland

    2017-07-01

    Principle 2 of the EPSRC's principles of robotics (AISB workshop on Principles of Robotics, 2016) proves to be future proof when applied to the current state of the art of law and technology surrounding autonomous intelligent cars (AICs). Humans, not AICs, are responsible agents. AICs should be designed and operated, as far as is practicable, to comply with existing laws and fundamental rights and freedoms, including privacy by design. It is shown that some legal questions arising from autonomous intelligent driving technology can be answered by the technology itself.

  19. Evidence for the Fundamental Difference Hypothesis or Not?: Island Constraints Revisited

    ERIC Educational Resources Information Center

    Belikova, Alyona; White, Lydia

    2009-01-01

    This article examines how changes in linguistic theory affect the debate between the fundamental difference hypothesis and the access-to-Universal Grammar (UG) approach to SLA. With a focus on subjacency (Chomsky, 1973), a principle of UG that places constraints on "wh"-movement and that has frequently been taken as a test case for verifying…

  20. On the Minimal Length Uncertainty Relation and the Foundations of String Theory

    DOE PAGES

    Chang, Lay Nam; Lewis, Zachary; Minic, Djordje; ...

    2011-01-01

    We review our work on the minimal length uncertainty relation as suggested by perturbative string theory. We discuss simple phenomenological implications of the minimal length uncertainty relation and then argue that the combination of the principles of quantum theory and general relativity allow for a dynamical energy-momentum space. We discuss the implication of this for the problem of vacuum energy and the foundations of nonperturbative string theory.
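
    The minimal length uncertainty relation referred to is commonly written in the form

        \Delta x\, \Delta p \ge \frac{\hbar}{2}\Bigl(1 + \beta\, (\Delta p)^2\Bigr) \quad\Longrightarrow\quad \Delta x_{\min} = \hbar \sqrt{\beta},

    so no state can be localized below \hbar\sqrt{\beta}, with \beta set by the string scale in the perturbative string-theory context.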

  1. AUTOMOTIVE DIESEL MAINTENANCE 2. UNIT XVI, LEARNING ABOUT AC GENERATOR (ALTERNATOR) PRINCIPLES (PART I).

    ERIC Educational Resources Information Center

    Human Engineering Inst., Cleveland, OH.

    THIS MODULE OF A 25-MODULE COURSE IS DESIGNED TO DEVELOP AN UNDERSTANDING OF THE OPERATING PRINCIPLES OF ALTERNATING CURRENT GENERATORS USED ON DIESEL POWERED EQUIPMENT. TOPICS ARE REVIEWING ELECTRICAL FUNDAMENTALS, AND OPERATING PRINCIPLES OF ALTERNATORS. THE MODULE CONSISTS OF A SELF-INSTRUCTIONAL PROGRAMED TRAINING FILM "AC GENERATORS…

  2. Nonlinear pulse compression in pulse-inversion fundamental imaging.

    PubMed

    Cheng, Yun-Chien; Shen, Che-Chou; Li, Pai-Chi

    2007-04-01

    Coded excitation can be applied in ultrasound contrast agent imaging to enhance the signal-to-noise ratio with minimal destruction of the microbubbles. Although the axial resolution is usually compromised by the requirement for a long coded transmit waveform, this can be restored by using a compression filter to compress the received echo. However, nonlinear responses from microbubbles may cause difficulties in pulse compression and result in severe range side-lobe artifacts, particularly in pulse-inversion-based (PI) fundamental imaging. The efficacy of pulse compression in nonlinear contrast imaging was evaluated by investigating several factors relevant to PI fundamental generation using both in-vitro experiments and simulations. The results indicate that the acoustic pressure and the bubble size can alter the nonlinear characteristics of microbubbles and change the performance of the compression filter. When nonlinear responses from contrast agents are enhanced by using a higher acoustic pressure or when more microbubbles are near the resonance size of the transmit frequency, higher range side lobes are produced in both linear imaging and PI fundamental imaging. On the other hand, contrast detection in PI fundamental imaging significantly depends on the magnitude of the nonlinear responses of the bubbles and thus the resultant contrast-to-tissue ratio (CTR) still increases with acoustic pressure and the nonlinear resonance of microbubbles. It should be noted, however, that the CTR in PI fundamental imaging after compression is consistently lower than that before compression due to obvious side-lobe artifacts. Therefore, the use of coded excitation is not beneficial in PI fundamental contrast detection.
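
    The compression filter mentioned above is essentially a matched filter: the received echo r is correlated with a replica of the coded transmit waveform s (the generic operation, not necessarily the authors' specific implementation),

        y(t) = \int r(\tau)\, s^{*}(\tau - t)\, d\tau,

    which restores axial resolution for linear echoes; microbubble nonlinearity, however, distorts r away from a scaled copy of s and thereby produces the range side lobes discussed here.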

  3. Solar activity cycles: indication of the existence of fundamental symmetry?

    NASA Astrophysics Data System (ADS)

    Dreschhoff, Gisela; Wong, Kai Wai; Curatolo, Susana; Jungner, Hogne; Perry, Charles

    Previous work has shown that there is a consistent pattern that seems to underlie the various known solar activity cycles, which is fundamentally based on the nuclear magnetic resonance (NMR) frequencies of some of the main isotopic constituents within the solar core, hydrogen-1 F(H-1)NMR and helium-3 F(He-3)NMR [1], resulting in a so-called “beat-frequency” and thereby suggesting that this mechanism may involve the entire Sun. Furthermore, it was found that the energy generating region of the Sun may be governed by an optimum condition where F(He-3)NMR = F(H-1)NMR, associated with an internal magnetic field of 7 Gauss, and the beat-frequency Fbeat representing the Schwabe periodicity [2]. Using the Schwabe cycle as the basic cycle length (C2), the astronomical and geophysical data (solar activity cycles C1) are represented by a fundamental harmonic progression of the form C1 = C2 x 2^n. We will attempt to show that this type of harmonic progression can be viewed as being part of fundamental principles of nature, as they are evident in the mathematical expression of 2^n matrices in group representations SU(n), or in the superposition of two states of one particle (2^1), with two states of two (or n) particles leading to 2^2 (or 2^n) possible combinations. We may show that these fundamental principles are linked to the newly developed 5D projection field theory and the realization of matter as proposed by Wong [3]. [1] C.A. Perry, Thesis 1989, University of Kansas [2] G. Dreschhoff, Adv. Space Res., 40, p. 1015-1020, 2007 [3] K.W. Wong, Nova Science, 2009, in press
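
    As a purely illustrative reading of the progression C1 = C2 x 2^n (our numbers, not the authors'): taking the Schwabe cycle C2 of roughly 11 yr gives

        C_1 = 11\,\mathrm{yr} \times 2^{1} = 22\,\mathrm{yr}, \qquad C_1 = 11\,\mathrm{yr} \times 2^{3} = 88\,\mathrm{yr},

    values close to the commonly cited Hale (about 22 yr) and Gleissberg (about 80-90 yr) periodicities.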

  4. Levitated Optomechanics for Fundamental Physics

    NASA Astrophysics Data System (ADS)

    Rashid, Muddassar; Bateman, James; Vovrosh, Jamie; Hempston, David; Ulbricht, Hendrik

    2015-05-01

    Optomechanics with levitated nano- and microparticles is believed to form a platform for testing fundamental principles of quantum physics, as well as find applications in sensing. We will report on a new scheme to trap nanoparticles, which is based on a parabolic mirror with a numerical aperture of 1. Combined with achromatic focussing, the setup is a cheap and straightforward solution to trapping nanoparticles for further study. Here, we report on the latest progress made in experimentation with levitated nanoparticles; this includes the trapping of 100 nm nanodiamonds (with NV-centres) down to 1 mbar as well as the trapping of 50 nm silica spheres down to 10^-4 mbar without any form of feedback cooling. We will also report on the progress to implement feedback stabilisation of the centre of mass motion of the trapped particle using digital electronics. Finally, we argue that such a stabilised particle trap can be the particle source for a nanoparticle matterwave interferometer. We will present our Talbot interferometer scheme, which holds promise to test the quantum superposition principle in the new mass range of 10^6 amu. EPSRC, John Templeton Foundation.

  5. Al-Air Batteries: Fundamental Thermodynamic Limitations from First-Principles Theory.

    PubMed

    Chen, Leanne D; Nørskov, Jens K; Luntz, Alan C

    2015-01-02

    The Al-air battery possesses high theoretical specific energy (4140 W h/kg) and is therefore an attractive candidate for vehicle propulsion. However, the experimentally observed open-circuit potential is much lower than what bulk thermodynamics predicts, and this potential loss is typically attributed to corrosion. Similarly, large Tafel slopes associated with the battery are assumed to be due to film formation. We present a detailed thermodynamic study of the Al-air battery using density functional theory. The results suggest that the maximum open-circuit potential of the Al anode is only -1.87 V versus the standard hydrogen electrode at pH 14.6 instead of the traditionally assumed -2.34 V and that large Tafel slopes are inherent in the electrochemistry. These deviations from the bulk thermodynamics are intrinsic to the electrochemical surface processes that define Al anodic dissolution. This has contributions from both asymmetry in multielectron transfers and, more importantly, a large chemical stabilization inherent to the formation of bulk Al(OH)3 from surface intermediates. These are fundamental limitations that cannot be improved even if corrosion and film effects are completely suppressed.
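
    For context on the Tafel-slope discussion, the standard single-step Tafel relation (background, not a result of the paper) is

        \eta = a + b \log_{10} j, \qquad b = \frac{2.303\, R T}{\alpha F} \approx 118\ \mathrm{mV/decade} \quad (\alpha = 0.5,\ T = 298\,\mathrm{K}),

    so slopes well above this benchmark are the kind the calculations attribute to asymmetric multielectron transfer and surface chemistry rather than to film formation alone.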

  6. The Holographic Principle and the Emergence of Spacetime

    NASA Astrophysics Data System (ADS)

    Rosenhaus, Vladimir

    Results within string theory and quantum gravity suggest that spacetime is not fundamental but rather emergent, with the fundamental degrees of freedom living on a boundary surface of one lower dimension than the bulk. This thesis is devoted to studying the holographic principle and its realization for spacetimes with both negative and positive cosmological constant. The holographic principle is most explicitly realized in the context of the AdS/CFT correspondence. We examine the extent to which AdS/CFT realizes the holographic principle and study the UV/IR relation. We study aspects of how bulk locality emerges within AdS/CFT. To this effect, we study how to reconstruct the bulk from boundary data. We study how such a reconstruction procedure is sensitive to large changes in the bulk geometry. We study if it is possible to reconstruct a subset of the bulk from a subset of the boundary data. We explore both local and nonlocal CFT quantities as probes of the bulk. One nonlocal quantity is entanglement entropy, and to this effect we construct a framework for computing entanglement entropy within the field theory. The most ambitious application of the holographic principle would be finding the holographic dual to the multiverse. We investigate properties of this putative duality. We extend the UV/IR relation of AdS/CFT to the multiverse, with the UV cutoff of the theory on future infinity being dual to a late time cutoff (measure) in the bulk. We compare various measure proposals and examine their predictions.

  7. Developing a Dynamics and Vibrations Course for Civil Engineering Students Based on Fundamental-Principles

    ERIC Educational Resources Information Center

    Barroso, Luciana R.; Morgan, James R.

    2012-01-01

    This paper describes the creation and evolution of an undergraduate dynamics and vibrations course for civil engineering students. Incorporating vibrations into the course allows students to see and study "real" civil engineering applications of the course content. This connection of academic principles to real life situations is in…

  8. Entropy of Vaidya Black Hole on Apparent Horizon with Minimal Length Revisited

    NASA Astrophysics Data System (ADS)

    Tang, Hao; Wu, Bin; Sun, Cheng-yi; Song, Yu; Yue, Rui-hong

    2018-03-01

    By considering the generalized uncertainty principle, the degrees of freedom near the apparent horizon of the Vaidya black hole are calculated with the thin film model. The result shows that a cut-off can be introduced naturally rather than being put in by hand. Furthermore, if the minimal length is chosen to be a specific value, the statistical entropy will satisfy the conventional area law at the horizon, which might reveal deeper implications of the minimal length.

  9. Entropy of Vaidya Black Hole on Apparent Horizon with Minimal Length Revisited

    NASA Astrophysics Data System (ADS)

    Tang, Hao; Wu, Bin; Sun, Cheng-yi; Song, Yu; Yue, Rui-hong

    2018-07-01

    By considering the generalized uncertainty principle, the degrees of freedom near the apparent horizon of the Vaidya black hole are calculated with the thin film model. The result shows that a cut-off can be introduced naturally rather than being put in by hand. Furthermore, if the minimal length is chosen to be a specific value, the statistical entropy will satisfy the conventional area law at the horizon, which might reveal deeper implications of the minimal length.

  10. Interfaces at equilibrium: A guide to fundamentals.

    PubMed

    Marmur, Abraham

    2017-06-01

    The fundamentals of the thermodynamics of interfaces are reviewed and concisely presented. The discussion starts with a short review of the elements of bulk thermodynamics that are also relevant to interfaces. It continues with the interfacial thermodynamics of two-phase systems, including the definition of interfacial tension and adsorption. Finally, the interfacial thermodynamics of three-phase (wetting) systems is discussed, including the topic of non-wettable surfaces. A clear distinction is made between equilibrium conditions, in terms of minimizing energies (internal, Gibbs or Helmholtz), and equilibrium indicators, in terms of measurable, intrinsic properties (temperature, chemical potential, pressure). It is emphasized that the equilibrium indicators are the same whatever energy is minimized, if the boundary conditions are properly chosen. Also, to avoid a common confusion, a distinction is made between systems of constant volume and systems with drops of constant volume. Copyright © 2016 Elsevier B.V. All rights reserved.
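
    For the three-phase (wetting) part of the discussion, the standard equilibrium condition is Young's equation (quoted here as background rather than from the review):

        \gamma_{SV} = \gamma_{SL} + \gamma_{LV} \cos\theta_Y,

    relating the three interfacial tensions to the equilibrium (Young) contact angle \theta_Y on an ideal solid surface.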

  11. Conscious visual memory with minimal attention.

    PubMed

    Pinto, Yair; Vandenbroucke, Annelinde R; Otten, Marte; Sligte, Ilja G; Seth, Anil K; Lamme, Victor A F

    2017-02-01

    Is conscious visual perception limited to the locations that a person attends? The remarkable phenomenon of change blindness, which shows that people miss nearly all unattended changes in a visual scene, suggests the answer is yes. However, change blindness is found after visual interference (a mask or a new scene), so that subjects have to rely on working memory (WM), which has limited capacity, to detect the change. Before such interference, however, a much larger capacity store, called fragile memory (FM), which is easily overwritten by newly presented visual information, is present. Whether these different stores depend equally on spatial attention is central to the debate on the role of attention in conscious vision. In 2 experiments, we found that minimizing spatial attention almost entirely erases visual WM, as expected. Critically, FM remains largely intact. Moreover, minimally attended FM responses yield accurate metacognition, suggesting that conscious memory persists with limited spatial attention. Together, our findings help resolve the fundamental issue of how attention affects perception: Both visual consciousness and memory can be supported by only minimal attention. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  12. Computed tomographic angiography in stroke imaging: fundamental principles, pathologic findings, and common pitfalls.

    PubMed

    Gupta, Rajiv; Jones, Stephen E; Mooyaart, Eline A Q; Pomerantz, Stuart R

    2006-06-01

    The development of multidetector row computed tomography (MDCT) now permits visualization of the entire vascular tree that is relevant for the management of stroke within 15 seconds. Advances in MDCT have brought computed tomography angiography (CTA) to the frontline in evaluation of stroke. CTA is a rapid and noninvasive modality for evaluating the neurovasculature. This article describes the role of CTA in the management of stroke. Fundamentals of contrast delivery, common pathologic findings, artifacts, and pitfalls in CTA interpretation are discussed.

  13. Violation of ethical principles in clinical research. Influences and possible solutions for Latin America.

    PubMed

    Cornejo Moreno, Borys Alberto; Gómez Arteaga, Gress Marissell

    2012-12-16

    Even though we are now well into the 21st century and notwithstanding all the abuse of individuals involved in clinical studies that has been documented throughout history, fundamental ethical principles continue to be violated in one way or another. Here are some of the main factors that contribute to the abuse of subjects participating in clinical trials: paternalism, improper use of informed consent, lack of strict ethical supervision, pressure exerted by health institutions to increase the production of scientific material, and the absence of legislation regarding ethics in terms of health care and research. Are researchers ready to respect fundamental ethical principles in light of the ample window of information provided by individual genomes, while defending the rights of the subjects participating in clinical studies as a major priority? As one of the possible solutions to this problem, education regarding fundamental ethical principles is suggested for participants in research studies as an initial method of cognitive training in ethics, together with the promotion of ethical behavior in order to encourage the adoption of reasonable policies in the field of values, attitudes and behavior.

  14. Compression as a Universal Principle of Animal Behavior

    ERIC Educational Resources Information Center

    Ferrer-i-Cancho, Ramon; Hernández-Fernández, Antoni; Lusseau, David; Agoramoorthy, Govindasamy; Hsu, Minna J.; Semple, Stuart

    2013-01-01

    A key aim in biology and psychology is to identify fundamental principles underpinning the behavior of animals, including humans. Analyses of human language and the behavior of a range of non-human animal species have provided evidence for a common pattern underlying diverse behavioral phenomena: Words follow Zipf's law of brevity (the…

  15. Synthetic Biology: Engineering Living Systems from Biophysical Principles.

    PubMed

    Bartley, Bryan A; Kim, Kyung; Medley, J Kyle; Sauro, Herbert M

    2017-03-28

    Synthetic biology was founded as a biophysical discipline that sought explanations for the origins of life from chemical and physical first principles. Modern synthetic biology has been reinvented as an engineering discipline to design new organisms as well as to better understand fundamental biological mechanisms. However, success is still largely limited to the laboratory and transformative applications of synthetic biology are still in their infancy. Here, we review six principles of living systems and how they compare and contrast with engineered systems. We cite specific examples from the synthetic biology literature that illustrate these principles and speculate on their implications for further study. To fully realize the promise of synthetic biology, we must be aware of life's unique properties. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  16. Image denoising by a direct variational minimization

    NASA Astrophysics Data System (ADS)

    Janev, Marko; Atanacković, Teodor; Pilipović, Stevan; Obradović, Radovan

    2011-12-01

    In this article we introduce a novel method for image de-noising which combines the mathematical well-posedness of variational modeling with the efficiency of a patch-based approach in the field of image processing. It is based on a direct minimization of an energy functional containing a minimal surface regularizer that uses a fractional gradient. The minimization is performed on every predefined patch of the image, independently. By doing so, we avoid the use of an artificial time PDE model with its inherent problems of finding the optimal stopping time as well as the optimal time step. Moreover, we control the level of image smoothing on each patch (and thus on the whole image) by adapting the Lagrange multiplier using information on the level of discontinuities on a particular patch, which we obtain by pre-processing. In order to reduce the average number of vectors in the approximation generator and still obtain minimal degradation, we combine a Ritz variational method for the actual minimization on a patch with a complementary fractional variational principle. Thus, the proposed method becomes computationally feasible and applicable for practical purposes. We confirm our claims with experimental results, comparing the proposed method with a couple of PDE-based methods and obtaining significantly better denoising results, especially in oscillatory regions.
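
    A schematic of the kind of patch-wise energy described (our notation; the paper's exact functional, fractional-gradient definition and constraints may differ): on each patch \Omega_p one minimizes

        E_p(u) = \int_{\Omega_p} \sqrt{1 + |\nabla^{\alpha} u|^2}\, dx + \frac{\lambda_p}{2} \int_{\Omega_p} (u - f)^2\, dx,

    where f is the noisy image, \nabla^{\alpha} a fractional gradient of order \alpha, and \lambda_p a patch-adapted Lagrange multiplier chosen from the pre-processed discontinuity information.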

  17. Leakage of the fundamental mode in photonic crystal fiber tapers.

    PubMed

    Nguyen, Hong C; Kuhlmey, Boris T; Steel, Michael J; Smith, Cameron L; Mägi, Eric C; McPhedran, Ross C; Eggleton, Benjamin J

    2005-05-15

    We report detailed measurements of the optical properties of tapered photonic crystal fibers (PCFs). We observe a striking long-wavelength loss as the fiber diameter is reduced, despite the minimal airhole collapse along the taper. We associate this loss with a transition of the fundamental core mode as the fiber dimensions contract: At wavelengths shorter than this transition wavelength, the core mode is strongly confined in the fiber microstructure, whereas at longer wavelengths the mode expands beyond the microstructure and couples out to higher-order modes. These experimental results are discussed in the context of the so-called fundamental mode cutoff described by Kuhlmey et al. [Opt. Express 10, 1285 (2002)], which apply to PCFs with a finite microstructure.

  18. Klein bottle logophysics: a unified principle for non-linear systems, cosmology, geophysics, biology, biomechanics and perception

    NASA Astrophysics Data System (ADS)

    Lucio Rapoport, Diego

    2013-04-01

    We present a unified principle for science that surmounts dualism, in terms of torsion fields and the non-orientable surfaces, notably the Klein Bottle and its logic, the Möbius strip and the projective plane. We apply it to the complex numbers and cosmology, to non-linear systems integrating the issue of hyperbolic divergences with the change of orientability, to the biomechanics of vision and the mammal heart, to the morphogenesis of crustal shapes on Earth in connection to the wavefronts of gravitation, elasticity and electromagnetism, to pattern recognition of artificial images and visual recognition, to neurology and the topographic maps of the sensorium, and to perception, in particular of music. We develop it in terms of the fundamental 2:1 resonance inherent to the Möbius strip and the Klein Bottle, the minimal surfaces representation of the wavefronts, and the non-dual Klein Bottle logic inherent to pattern recognition, to the harmonic functions and to the vector fields that lie at the basis of geophysics and physics at large. We discuss the relation between the topographic maps of the sensorium and the issue of the turning inside-out of the visual world as a general principle for cognition, topological chemistry, cell biology and biological morphogenesis, in particular in embryology.

  19. Limited access atrial septal defect closure and the evolution of minimally invasive surgery.

    PubMed

    Izzat, M B; Yim, A P; El-Zufari, M H

    1998-04-01

    While minimizing the "invasiveness" in general surgery has been equated with minimizing "access", what constitutes minimally invasive intra-cardiac surgery remains controversial. Many surgeons doubt the benefits of minimizing access when the need for cardiopulmonary bypass cannot be waived. Recognizing that median sternotomy itself does entail significant morbidity, we investigated the value of alternative approaches to median sternotomy using atrial septal defect closure as our investigative model. We believe that some, but not all minimal access approaches are associated with reduced postoperative morbidity and enhanced recovery. Our current strategy is to use a mini-sternotomy approach in adult patients, whereas conventional median sternotomy remains our standard approach in the pediatric population. Considerable clinical experiences coupled with documented clinical benefits are fundamental before a certain approach is adopted in routine practice.

  20. Precautionary principles: a jurisdiction-free framework for decision-making under risk.

    PubMed

    Ricci, Paolo F; Cox, Louis A; MacDonald, Thomas R

    2004-12-01

    Fundamental principles of precaution are legal maxims that ask for preventive actions, perhaps as contingent interim measures while relevant information about causality and harm remains unavailable, to minimize the societal impact of potentially severe or irreversible outcomes. Such principles do not explain how to make choices or how to identify what is protective when incomplete and inconsistent scientific evidence of causation characterizes the potential hazards. Rather, they entrust lower jurisdictions, such as agencies or authorities, to make current decisions while recognizing that future information can contradict the scientific basis that supported the initial decision. After reviewing and synthesizing national and international legal aspects of precautionary principles, this paper addresses the key question: How can society manage potentially severe, irreversible or serious environmental outcomes when variability, uncertainty, and limited causal knowledge characterize their decision-making? A decision-analytic solution is outlined that focuses on risky decisions and accounts for prior states of information and scientific beliefs that can be updated as subsequent information becomes available. As a practical and established approach to causal reasoning and decision-making under risk, inherent to precautionary decision-making, these (Bayesian) methods help decision-makers and stakeholders because they formally account for probabilistic outcomes, new information, and are consistent and replicable. Rational choice of an action from among various alternatives--defined as a choice that makes preferred consequences more likely--requires accounting for costs, benefits and the change in risks associated with each candidate action. Decisions under any form of the precautionary principle reviewed must account for the contingent nature of scientific information, creating a link to the decision-analytic principle of expected value of information (VOI), to show the

  1. [Principles of fast track surgery. Multimodal perioperative therapy programme].

    PubMed

    Kehlet, H

    2009-08-01

    Recent evidence has documented that a combination of single-modality evidence-based care principles into a multimodal effort to enhance postoperative recovery (the fast track methodology) has led to enhanced recovery with reduced medical morbidity, need for hospitalisation and convalescence. Nevertheless, general implementation of fast track surgery has been relatively slow despite concomitant economic benefits. Further improvement in postoperative outcome may be obtained by developments within each care principle with a specific focus on minimally invasive surgery, effective multimodal, non-opioid analgesia and pharmacological stress reduction.

  2. Fundamental ecology is fundamental.

    PubMed

    Courchamp, Franck; Dunne, Jennifer A; Le Maho, Yvon; May, Robert M; Thébaud, Christophe; Hochberg, Michael E

    2015-01-01

    The primary reasons for conducting fundamental research are satisfying curiosity, acquiring knowledge, and achieving understanding. Here we develop why we believe it is essential to promote basic ecological research, despite increased impetus for ecologists to conduct and present their research in the light of potential applications. This includes the understanding of our environment, for intellectual, economical, social, and political reasons, and as a major source of innovation. We contend that we should focus less on short-term, objective-driven research and more on creativity and exploratory analyses, quantitatively estimate the benefits of fundamental research for society, and better explain the nature and importance of fundamental ecology to students, politicians, decision makers, and the general public. Our perspective and underlying arguments should also apply to evolutionary biology and to many of the other biological and physical sciences. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Representations in Dynamical Embodied Agents: Re-Analyzing a Minimally Cognitive Model Agent

    ERIC Educational Resources Information Center

    Mirolli, Marco

    2012-01-01

    Understanding the role of "representations" in cognitive science is a fundamental problem facing the emerging framework of embodied, situated, dynamical cognition. To make progress, I follow the approach proposed by an influential representational skeptic, Randall Beer: building artificial agents capable of minimally cognitive behaviors and…

  4. Use of minimal invasive extracorporeal circulation in cardiac surgery: principles, definitions and potential benefits. A position paper from the Minimal invasive Extra-Corporeal Technologies international Society (MiECTiS)

    PubMed Central

    Anastasiadis, Kyriakos; Murkin, John; Antonitsis, Polychronis; Bauer, Adrian; Ranucci, Marco; Gygax, Erich; Schaarschmidt, Jan; Fromes, Yves; Philipp, Alois; Eberle, Balthasar; Punjabi, Prakash; Argiriadou, Helena; Kadner, Alexander; Jenni, Hansjoerg; Albrecht, Guenter; van Boven, Wim; Liebold, Andreas; de Somer, Fillip; Hausmann, Harald; Deliopoulos, Apostolos; El-Essawi, Aschraf; Mazzei, Valerio; Biancari, Fausto; Fernandez, Adam; Weerwind, Patrick; Puehler, Thomas; Serrick, Cyril; Waanders, Frans; Gunaydin, Serdar; Ohri, Sunil; Gummert, Jan; Angelini, Gianni; Falk, Volkmar; Carrel, Thierry

    2016-01-01

    Minimal invasive extracorporeal circulation (MiECC) systems have initiated important efforts within science and technology to further improve the biocompatibility of cardiopulmonary bypass components to minimize the adverse effects and improve end-organ protection. The Minimal invasive Extra-Corporeal Technologies international Society was founded to create an international forum for the exchange of ideas on clinical application and research of minimal invasive extracorporeal circulation technology. The present work is a consensus document developed to standardize the terminology and the definition of minimal invasive extracorporeal circulation technology as well as to provide recommendations for clinical practice. The goal of this manuscript is to promote the use of MiECC systems in clinical practice as a multidisciplinary strategy involving cardiac surgeons, anaesthesiologists and perfusionists. PMID:26819269

  5. Al-Air Batteries: Fundamental Thermodynamic Limitations from First Principles Theory

    NASA Astrophysics Data System (ADS)

    Chen, Leanne D.; Noerskov, Jens K.; Luntz, Alan C.

    2015-03-01

    The Al-air battery possesses high theoretical specific energy (4140 Wh/kg) and is therefore an attractive candidate for vehicle propulsion applications. However, the experimentally observed open-circuit potential is much lower than what thermodynamics predicts, and this potential loss is widely believed to be an effect of corrosion. We present a detailed study of the Al-air battery using density functional theory. The results suggest that the difference between bulk thermodynamic and surface potentials is due to both the effects of asymmetry in multi-electron transfer reactions that define the anodic dissolution of Al and, more importantly, a large chemical step inherent to the formation of bulk Al(OH)3 from surface intermediates. The former results in an energy loss of 3%, while the latter accounts for 14-29% of the total thermodynamic energy depending on the surface site where dissolution occurs. Therefore, the maximum open-circuit potential of the Al anode is only -1.87 V vs. SHE in the absence of thermal excitations, contrary to -2.34 V predicted by bulk thermodynamics at pH 14.6. This is a fundamental limitation of the system and governs the maximum output potential, which cannot be improved even if corrosion effects were completely suppressed. Supported by the Natural Sciences and Engineering Research Council of Canada and the ReLiable Project (#11-116792) funded by the Danish Council for Strategic Research.

  6. Fundamentals of Clinical Outcomes Assessment for Spinal Disorders: Clinical Outcome Instruments and Applications

    PubMed Central

    Vavken, Patrick; Ganal-Antonio, Anne Kathleen B.; Quidde, Julia; Shen, Francis H.; Chapman, Jens R.; Samartzis, Dino

    2015-01-01

    Study Design: A broad narrative review. Objectives: Outcome assessment in spinal disorders is imperative to help monitor the safety and efficacy of the treatment in an effort to change the clinical practice and improve patient outcomes. The following article, part two of a two-part series, discusses the various outcome tools and instruments utilized to address spinal disorders and their management. Methods: A thorough review of the peer-reviewed literature was performed, irrespective of language, addressing outcome research, instruments and tools, and applications. Results: Numerous articles addressing the development and implementation of health-related quality-of-life, neck and low back pain, overall pain, spinal deformity, and other condition-specific outcome instruments have been reported. Their applications in the context of the clinical trial studies, the economic analyses, and overall evidence-based orthopedics have been noted. Additional issues regarding the problems and potential sources of bias utilizing outcomes scales and the concept of minimally clinically important difference were discussed. Conclusion: Continuing research needs to assess the outcome instruments and tools used in the clinical outcome assessment for spinal disorders. Understanding the fundamental principles in spinal outcome assessment may also advance the field of “personalized spine care.” PMID:26225283

  7. Fundamental Understanding of Propellant/Nozzle Interaction for Rocket Nozzle Erosion Minimization Under Very High Pressure Conditions

    DTIC Science & Technology

    2005-08-31

    Interim progress report, August 1, 2004 to July 31, 2005. The effort addresses nozzle erosion by solid-propellant combustion products; several processes can affect the nozzle erosion rate at high pressure and temperature. Erosion rates are measured with X-ray radiography under high-pressure conditions, and a vortex combustor was also designed to simulate propellant product species.

  8. Structural building principles of complex face-centered cubic intermetallics.

    PubMed

    Dshemuchadse, Julia; Jung, Daniel Y; Steurer, Walter

    2011-08-01

    Fundamental structural building principles are discussed for all 56 known intermetallic phases with approximately 400 or more atoms per unit cell and space-group symmetry F-43m, Fd-3m, Fd-3, Fm-3m or Fm-3c. Despite fundamental differences in chemical composition, bonding and electronic band structure, their complex crystal structures show striking similarities indicating common building principles. We demonstrate that the structure-determining elements are flat and puckered atomic {110} layers stacked with periodicities 2p. The atoms on this set of layers, which intersect each other, form pentagon face-sharing endohedral fullerene-like clusters arranged in a face-centered cubic packing (f.c.c.). Due to their topological layer structure, all these crystal structures can be described as (p × p × p) = p³-fold superstructures of a common basic structure of the double-diamond type. The parameter p, with p = 3, 4, 7 or 11, is determined by the number of layers per repeat unit and the type of cluster packing, which in turn are controlled by chemical composition.

  9. What is Quantum Mechanics? A Minimal Formulation

    NASA Astrophysics Data System (ADS)

    Friedberg, R.; Hohenberg, P. C.

    2018-03-01

    This paper presents a minimal formulation of nonrelativistic quantum mechanics, by which is meant a formulation which describes the theory in a succinct, self-contained, clear, unambiguous and of course correct manner. The bulk of the presentation is the so-called "microscopic theory", applicable to any closed system S of arbitrary size N, using concepts referring to S alone, without resort to external apparatus or external agents. An example of a similar minimal microscopic theory is the standard formulation of classical mechanics, which serves as the template for a minimal quantum theory. The only substantive assumption required is the replacement of the classical Euclidean phase space by Hilbert space in the quantum case, with the attendant all-important phenomenon of quantum incompatibility. Two fundamental theorems of Hilbert space, the Kochen-Specker-Bell theorem and Gleason's theorem, then lead inevitably to the well-known Born probability rule. For both classical and quantum mechanics, questions of physical implementation and experimental verification of the predictions of the theories are the domain of the macroscopic theory, which is argued to be a special case or application of the more general microscopic theory.
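
    As a hedged aside on the standard Hilbert-space results the abstract invokes (the notation below is assumed, not quoted from the paper), Gleason's theorem states that for a Hilbert space of dimension three or more, every probability measure on projection operators P must take the Born form

    \[
    p(P) = \operatorname{Tr}(\rho P), \qquad \rho \ge 0, \quad \operatorname{Tr}\rho = 1,
    \]

    so the probability rule appears as a consequence of the Hilbert-space structure of the closed system S rather than as an independent postulate.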

  10. Design principles and algorithms for automated air traffic management

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    1995-01-01

    This paper presents design principles and algorithms for building a real-time scheduler. The primary objective of the scheduler is to assign arrival aircraft to a favorable landing runway and schedule them to land at times that minimize delays. A further objective of the scheduler is to allocate delays between high-altitude airspace far from the airport and low-altitude airspace near the airport. A method of delay allocation is described that minimizes the average operating cost in the presence of errors in controlling aircraft to a specified landing time.
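
    The scheduling problem sketched above can be made concrete with a toy example. The following is a minimal illustrative sketch, not Erzberger's scheduler: the flight names, the fixed separation value, and the earliest-available-runway heuristic are all assumptions introduced here.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Arrival:
        flight: str
        eta: float  # estimated time of arrival, in minutes

    def schedule(arrivals, runways, separation=2.0):
        """Greedily assign each arrival (in ETA order) to the runway that
        yields the earliest feasible landing time, i.e. the smallest delay."""
        next_free = {r: 0.0 for r in runways}   # earliest time each runway is free
        plan = []
        for a in sorted(arrivals, key=lambda x: x.eta):
            # pick the runway minimizing the scheduled landing time
            r = min(runways, key=lambda r: max(a.eta, next_free[r]))
            slot = max(a.eta, next_free[r])
            next_free[r] = slot + separation
            plan.append((a.flight, r, slot, slot - a.eta))  # (flight, runway, STA, delay)
        return plan

    if __name__ == "__main__":
        demo = [Arrival("AAL12", 10.0), Arrival("UAL7", 10.5), Arrival("DAL3", 11.0)]
        for row in schedule(demo, runways=["28L", "28R"]):
            print(row)
    ```

    A production scheduler would additionally allocate delay between high- and low-altitude airspace and absorb control errors, which this sketch omits.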

  11. Water System Adaptation To Hydrological Changes: Module 7, Adaptation Principles and Considerations

    EPA Science Inventory

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  12. Metaphysics of the principle of least action

    NASA Astrophysics Data System (ADS)

    Terekhovich, Vladislav

    2018-05-01

    Despite the importance of the variational principles of physics, there have been relatively few attempts to consider them for a realistic framework. In addition to the old teleological question, this paper continues the recent discussion regarding the modal involvement of the principle of least action and its relations with the Humean view of the laws of nature. The reality of possible paths in the principle of least action is examined from the perspectives of the contemporary metaphysics of modality and Leibniz's concept of essences or possibles striving for existence. I elaborate a modal interpretation of the principle of least action that replaces a classical representation of a system's motion along a single history in the actual modality by simultaneous motions along an infinite set of all possible histories in the possible modality. This model is based on an intuition that deep ontological connections exist between the possible paths in the principle of least action and possible quantum histories in the Feynman path integral. I interpret the action as a physical measure of the essence of every possible history. Therefore only one actual history has the highest degree of the essence and minimal action. To address the issue of necessity, I assume that the principle of least action has a general physical necessity and lies between the laws of motion with a limited physical necessity and certain laws with a metaphysical necessity.
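
    For readers who want the formulas the discussion presupposes, the following are the standard textbook forms (assumed here, not quoted from the paper): the classical action with its stationarity condition, and the Feynman path integral that sums amplitudes over all possible histories,

    \[
    S[q] = \int_{t_1}^{t_2} L(q, \dot{q}, t)\,dt, \qquad \delta S[q_{\mathrm{cl}}] = 0, \qquad
    K(b,a) = \int \mathcal{D}q(t)\, e^{iS[q]/\hbar}.
    \]

    On the modal reading described above, every possible history contributes an amplitude weighted by its action, and the actual history is the one at which the action is stationary and, in the typical case, minimal.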

  13. Anatomical principles for minimally invasive reconstruction of the acromioclavicular joint with anchors.

    PubMed

    Xiong, Chuanzhi; Lu, Yaojia; Wang, Qiang; Chen, Gang; Hu, Hansheng; Lu, Zhihua

    2016-11-01

    The aim of this study was to evaluate the outcome of a minimally invasive surgical technique for the treatment of patients with acromioclavicular joint dislocation. Sixteen patients with complete acromioclavicular joint dislocation were enrolled in this study. All patients were asked to follow the less active rehabilitation protocol post-operatively. Computed tomography with 3-D reconstruction of the injured shoulder was performed on each patient post-operatively for the assessment of the accuracy of the suture anchor placement in the coracoid process and the reduction of the acromioclavicular joint. Radiographs of the Zanca view and axillary view of both shoulders were taken to evaluate the maintenance of the acromioclavicular joint reduction at each follow-up visit. The Constant shoulder score was used for function assessment at the final follow-up. Twenty-seven of the 32 anchors implanted in the coracoid process met the criteria of good position. One patient developed complete loss of reduction and another had partial loss of reduction in the anteroposterior plane. For the other 14 patients, the mean Constant score was 90 (range, 82-95). For the patients with partial and complete loss of reduction, the Constant scores were 92 and 76, respectively. All of them regained a nearly normal range of shoulder motion and returned to their pre-operative life and work. With this minimally invasive approach and limited exposure of the coracoid, a surgeon can place the suture anchors at the anatomical insertions of the coracoclavicular ligament and allow the dislocated joint to be reduced and the reduction to be well maintained. Level IV, Case series; therapeutic study.

  14. Swarm robotics and minimalism

    NASA Astrophysics Data System (ADS)

    Sharkey, Amanda J. C.

    2007-09-01

    Swarm Robotics (SR) is closely related to Swarm Intelligence, and both were initially inspired by studies of social insects. Their guiding principles are based on their biological inspiration and take the form of an emphasis on decentralized local control and communication. Earlier studies went a step further in emphasizing the use of simple reactive robots that only communicate indirectly through the environment. More recently SR studies have moved beyond these constraints to explore the use of non-reactive robots that communicate directly, and that can learn and represent their environment. There is no clear agreement in the literature about how far such extensions of the original principles could go. Should there be any limitations on the individual abilities of the robots used in SR studies? Should knowledge of the capabilities of social insects lead to constraints on the capabilities of individual robots in SR studies? There is a lack of explicit discussion of such questions, and researchers have adopted a variety of constraints for a variety of reasons. A simple taxonomy of swarm robotics is presented here with the aim of addressing and clarifying these questions. The taxonomy distinguishes subareas of SR based on the emphases and justifications for minimalism and individual simplicity.

  15. The Didactic Principles and Their Applications in the Didactic Activity

    ERIC Educational Resources Information Center

    Marius-Costel, Esi

    2010-01-01

    The evaluation and reevaluation of the fundamental didactic principles suppose the acceptance at the level of an instructive-educative activity of a new educational paradigm. Thus, its understanding implies an assumption at a conceptual-theoretical level of some approaches where the didactic aspects find their usefulness by relating to value…

  16. The principle of procreative beneficence: old arguments and a new challenge.

    PubMed

    Hotke, Andrew

    2014-06-01

    In the last ten years, there have been a number of attempts to refute Julian Savulescu's Principle of Procreative Beneficence; a principle which claims that parents have a moral obligation to have the best child that they can possibly have. So far, no arguments against this principle have succeeded at refuting it. This paper tries to explain the shortcomings of some of the more notable arguments against this principle. I attempt to break down the argument for the principle and in doing so, I explain what is needed to properly refute it. This helps me show how and why the arguments of Rebecca Bennett, Sarah Stoller and others fail to refute the principle. Afterwards, I offer a new challenge to the principle. I attack what I understand to be a fundamental premise of the argument, a premise which has been overlooked in the literature written about this principle. I argue that there is no reason to suppose, as Savulescu does, that morality requires us to do what we have most reason to do. If we reject this premise, as I believe we have reason to do, the argument for Procreative Beneficence fails. © 2012 John Wiley & Sons Ltd.

  17. Limits on fundamental limits to computation.

    PubMed

    Markov, Igor L

    2014-08-14

    An indispensable part of our personal and working lives, computing has also become essential to industries and governments. Steady improvements in computer hardware have been supported by periodic doubling of transistor densities in integrated circuits over the past fifty years. Such Moore scaling now requires ever-increasing efforts, stimulating research in alternative hardware and stirring controversy. To help evaluate emerging technologies and increase our understanding of integrated-circuit scaling, here I review fundamental limits to computation in the areas of manufacturing, energy, physical space, design and verification effort, and algorithms. To outline what is achievable in principle and in practice, I recapitulate how some limits were circumvented, and compare loose and tight limits. Engineering difficulties encountered by emerging technologies may indicate yet unknown limits.

  18. Deterministic and stochastic algorithms for resolving the flow fields in ducts and networks using energy minimization

    NASA Astrophysics Data System (ADS)

    Sochi, Taha

    2016-09-01

    Several deterministic and stochastic multi-variable global optimization algorithms (Conjugate Gradient, Nelder-Mead, Quasi-Newton and global) are investigated in conjunction with the energy minimization principle to resolve the pressure and volumetric flow rate fields in single ducts and networks of interconnected ducts. The algorithms are tested with seven types of fluid: Newtonian, power law, Bingham, Herschel-Bulkley, Ellis, Ree-Eyring and Casson. The results obtained from all these algorithms for all these types of fluid agree very well with the analytically derived solutions obtained from the traditional methods, which are based on conservation principles and fluid constitutive relations. The results confirm and generalize the findings of our previous investigations that the energy minimization principle is at the heart of flow dynamics systems. The investigation also enriches the methods of computational fluid dynamics for solving the flow fields in tubes and networks for various types of Newtonian and non-Newtonian fluids.
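
    A minimal sketch of the energy-minimization idea on a toy network (a fixed inflow splitting into two parallel Newtonian ducts), not the paper's algorithms: the resistance values and the use of SciPy's Nelder-Mead routine are assumptions for illustration only.

    ```python
    from scipy.optimize import minimize

    # Toy network: a fixed inflow Q splits into two parallel Newtonian ducts
    # with hydraulic resistances R1 and R2 (Poiseuille-like, delta_p = R * q).
    Q, R1, R2 = 1.0, 3.0, 1.0

    def dissipation(x):
        q1 = x[0]
        q2 = Q - q1                      # mass conservation eliminates q2
        return R1 * q1**2 + R2 * q2**2   # total rate of energy dissipation

    res = minimize(dissipation, x0=[0.5 * Q], method="Nelder-Mead")
    q1_opt = res.x[0]
    print("q1 =", q1_opt, " q2 =", Q - q1_opt)
    print("expected q1 =", Q * R2 / (R1 + R2))   # classical equal-pressure-drop split
    ```

    For this linear (Newtonian) case the minimizer reproduces the classical split q1/q2 = R2/R1, the same result the traditional conservation-plus-constitutive approach gives.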

  19. Fundamental Quantum 1/F Noise in Ultrasmall Semiconductor Devices and Their Optimal Design Principles

    DTIC Science & Technology

    1988-05-31

    The report discusses the Hooge parameter and the 1/f noise of the recombination current generated in the depletion region. Two forms of quantum 1/f noise are distinguished, and the corresponding Hooge parameters are derived from quantum 1/f noise theory. Principal investigator: Handel, Peter H.

  20. Non-minimal Higgs inflation and frame dependence in cosmology

    NASA Astrophysics Data System (ADS)

    Steinwachs, Christian F.; Kamenshchik, Alexander Yu.

    2013-02-01

    We investigate a very general class of cosmological models with scalar fields non-minimally coupled to gravity. A particular representative in this class is given by the non-minimal Higgs inflation model in which the Standard Model Higgs boson and the inflaton are described by one and the same scalar particle. While the predictions of the non-minimal Higgs inflation scenario come numerically remarkably close to the recently discovered mass of the Higgs boson, there remains a conceptual problem in this model that is associated with the choice of the cosmological frame. While the classical theory is independent of this choice, we find by an explicit calculation that already the first quantum corrections induce a frame dependence. We give a geometrical explanation of this frame dependence by embedding it into a more general field theoretical context. From this analysis, some conceptual points in the long-standing cosmological debate "Jordan frame vs. Einstein frame" become more transparent and can in principle be resolved in a natural way.
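
    As a hedged point of reference (the paper's own conventions may differ), the Jordan-frame action commonly used for non-minimal Higgs inflation couples the Higgs doublet H to the Ricci scalar as

    \[
    S_J = \int d^4x \sqrt{-g}\,\Big[\Big(\tfrac{M_P^2}{2} + \xi\, H^\dagger H\Big) R
          - g^{\mu\nu}(D_\mu H)^\dagger D_\nu H - \lambda\Big(H^\dagger H - \tfrac{v^2}{2}\Big)^2\Big],
    \]

    and the frame ambiguity discussed in the abstract arises when this is mapped to the Einstein frame by the conformal rescaling \(\tilde{g}_{\mu\nu} = \Omega^2 g_{\mu\nu}\) with \(\Omega^2 = 1 + 2\xi H^\dagger H/M_P^2\).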

  1. Principle of Spacetime and Black Hole Equivalence

    NASA Astrophysics Data System (ADS)

    Zhang, Tianxi

    2016-06-01

    Modelling the universe without relying on a set of hypothetical entities (HEs) to explain observations and overcome problems and difficulties is essential to developing a physical cosmology. The well-known big bang cosmology, widely accepted as the standard model, stands on two fundamentals, which are Einstein’s general relativity (GR) that describes the effect of matter on spacetime and the cosmological principle (CP) of spacetime isotropy and homogeneity. The field equation of GR along with the Friedmann-Lemaitre-Robertson-Walker (FLRW) metric of spacetime derived from CP generates the Friedmann equation (FE) that governs the development and dynamics of the universe. The big bang theory has achieved impressive successes in explaining the universe, but it still has problems, whose solutions rely on an increasing number of HEs such as inflation, dark matter, dark energy, and so on. Recently, the author has developed a new cosmological model called the black hole universe, which, instead of making many of those hypotheses, adds only a single new postulate (or new principle) to cosmology - the Principle of Spacetime and Black Hole Equivalence (SBHEP) - to explain all the existing observations of the universe and overcome all the existing problems in conventional cosmologies. This study thoroughly demonstrates how this newly developed black hole universe model, which therefore stands on the three fundamentals (GR, CP, and SBHEP), can fully explain the universe and readily overcome the difficulties using well-developed physics, thus neither requiring any additional hypotheses nor leaving any unsolved difficulties. This work was supported by NSF/REU (Grant #: PHY-1263253) at Alabama A & M University.

  2. Communication: Fitting potential energy surfaces with fundamental invariant neural network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shao, Kejie; Chen, Jun; Zhao, Zhiqiang

    A more flexible neural network (NN) method using the fundamental invariants (FIs) as the input vector is proposed in the construction of potential energy surfaces for molecular systems involving identical atoms. Mathematically, FIs finitely generate the permutation invariant polynomial (PIP) ring. In combination with NN, fundamental invariant neural network (FI-NN) can approximate any function to arbitrary accuracy. Because FI-NN minimizes the size of input permutation invariant polynomials, it can efficiently reduce the evaluation time of potential energy, in particular for polyatomic systems. In this work, we provide the FIs for all possible molecular systems up to five atoms. Potential energy surfaces for OH3 and CH4 were constructed with FI-NN, with the accuracy confirmed by full-dimensional quantum dynamic scattering and bound state calculations.
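
    A minimal sketch of the permutation-invariant-input idea for an A2B triatomic (two identical atoms), not the authors' FI-NN code: the Morse-variable invariants and the tiny random-weight NumPy network below are assumptions chosen only to show that the symmetry is exact by construction.

    ```python
    import numpy as np

    def invariants(r12, r13, r23, a=1.0):
        """Invariants of an A2B system (atoms 1 and 2 identical) built from
        Morse variables y_ij = exp(-r_ij / a); they are unchanged when the
        two identical atoms are swapped (r13 <-> r23)."""
        y12, y13, y23 = np.exp(-r12 / a), np.exp(-r13 / a), np.exp(-r23 / a)
        return np.array([y12, y13 + y23, y13 * y23])

    def mlp(x, W1, b1, W2, b2):
        """One-hidden-layer network mapping invariants to an energy."""
        return W2 @ np.tanh(W1 @ x + b1) + b2

    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(8, 3)), np.zeros(8)
    W2, b2 = rng.normal(size=(1, 8)), np.zeros(1)

    # Swapping the two identical atoms leaves the predicted energy unchanged.
    e1 = mlp(invariants(1.8, 1.0, 1.2), W1, b1, W2, b2)
    e2 = mlp(invariants(1.8, 1.2, 1.0), W1, b1, W2, b2)
    assert np.allclose(e1, e2)
    print(e1)
    ```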

  3. Supramolecular chemistry and chemical warfare agents: from fundamentals of recognition to catalysis and sensing.

    PubMed

    Sambrook, M R; Notman, S

    2013-12-21

    Supramolecular chemistry presents many possible avenues for the mitigation of the effects of chemical warfare agents (CWAs), including sensing, catalysis and sequestration. To date, efforts in this field both to study fundamental interactions between CWAs and to design and exploit host systems remain sporadic. In this tutorial review the non-covalent recognition of CWAs is considered from first principles, including taking inspiration from enzymatic systems, and gaps in fundamental knowledge are indicated. Examples of synthetic systems developed for the recognition of CWAs are discussed with a focus on the supramolecular complexation behaviour and non-covalent approaches rather than on the proposed applications.

  4. Competency Test Items for Applied Principles of Agribusiness and Natural Resources Occupations. Forestry Component. A Report of Research.

    ERIC Educational Resources Information Center

    Cheek, Jimmy G.; McGhee, Max B.

    An activity was undertaken to develop written criterion-referenced tests for the forestry component of Applied Principles of Agribusiness and Natural Resources. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist of three components, with…

  5. Preferential flow, connectivity and the principle of "minimum time to equilibrium": a new perspective on environmental water flow

    NASA Astrophysics Data System (ADS)

    Zehe, E.; Blume, T.; Bloeschl, G.

    2008-12-01

    Preferential/rapid flow and transport has been known as a key process in soil hydrology for more than 20 years. It seems to be the rule rather than the exception. It occurs in soils, in surface rills and river networks. If connective preferential pathways are present at any scale, they crucially control water flow and solute transport. Why? Is there an underlying principle? If energy is conserved, a system follows Fermat's principle of minimum action, i.e. it follows the trajectory that minimises the integral of the total energy/Lagrangian over time. Hydrological systems are, however, non-conservative as surface and subsurface water flows dissipate energy. From thermodynamics it is well known that natural processes minimize the free energy of the system. For hydrological systems we suggest, therefore, that flow in a catchment arranges in such a way that the time to a minimum of free energy becomes minimal for a given rainfall input (disturbance) and under given constraints. Free energy in a soil is determined by potential energy and capillary energy. The pore size distribution of the soil, soil structures, depth to groundwater and, most importantly, vegetation make up the constraints. The pore size distribution determines whether potential energy or capillarity dominates the free energy of the soil system. The first term is minimal when the pore space is completely de-saturated, whereas the latter becomes minimal at soil saturation. Hence, the soil determines a) the amount of excess (gravity) water that has to be exported from the soil to reach a minimum state of free energy and b) whether redistribution or groundwater recharge is more efficient to reach that equilibrium. On the other hand, the pore size distribution of the soil and the connectivity of preferential pathways (root channels, worm holes and cracks) determine flow velocities and the redistribution of water within the pore space. As water flow and groundwater recharge are fast in sandy soils and capillary energy is of minor

  6. The uncertainty principle and quantum chaos

    NASA Technical Reports Server (NTRS)

    Chirikov, Boris V.

    1993-01-01

    The conception of quantum chaos is described in some detail. The most striking feature of this novel phenomenon is that all the properties of classical dynamical chaos persist here but, typically, on the finite and different time scales only. The ultimate origin of such a universal quantum stability is in the fundamental uncertainty principle which makes discrete the phase space and, hence, the spectrum of bounded quantum motion. Reformulation of the ergodic theory, as a part of the general theory of dynamical systems, is briefly discussed.

  7. Circular motion geometry using minimal data.

    PubMed

    Jiang, Guang; Quan, Long; Tsui, Hung-Tat

    2004-06-01

    Circular motion or single axis motion is widely used in computer vision and graphics for 3D model acquisition. This paper describes a new and simple method for recovering the geometry of uncalibrated circular motion from a minimal set of only two points in four images. This problem has been previously solved using nonminimal data either by computing the fundamental matrix and trifocal tensor in three images or by fitting conics to tracked points in five or more images. It is first established that two sets of tracked points in different images under circular motion for two distinct space points are related by a homography. Then, we compute a plane homography from a minimal set of two points in four images. After that, we show that the unique pair of complex conjugate eigenvectors of this homography are the images of the circular points of the parallel planes of the circular motion. Subsequently, all other motion and structure parameters are computed from this homography in a straightforward manner. The experiments on real image sequences demonstrate the simplicity, accuracy, and robustness of the new method.
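
    A minimal sketch of the eigenvector step described above (the homography is assumed to have been estimated already; the toy H below is just a conjugated rotation built from assumed camera intrinsics, not data from the paper):

    ```python
    import numpy as np

    def imaged_circular_points(H, tol=1e-9):
        """Given a 3x3 plane homography induced by circular motion, return the
        unique pair of complex-conjugate eigenvectors; by the result described
        in the abstract these are the images of the circular points."""
        w, V = np.linalg.eig(H)
        idx = [i for i in range(3) if abs(w[i].imag) > tol]
        assert len(idx) == 2, "expected exactly one complex-conjugate pair"
        return V[:, idx[0]], V[:, idx[1]]

    # Toy homography: a rotation about an (assumed) vertical axis conjugated by
    # assumed camera intrinsics K -- only meant to exercise the eigen step.
    t = 0.3
    R = np.array([[np.cos(t), 0.0, np.sin(t)],
                  [0.0,       1.0, 0.0      ],
                  [-np.sin(t), 0.0, np.cos(t)]])
    K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
    H = K @ R @ np.linalg.inv(K)

    I_img, J_img = imaged_circular_points(H)
    print(I_img / I_img[2])   # homogeneous image coordinates, defined up to scale
    ```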

  8. Dynamical minimalism: why less is more in psychology.

    PubMed

    Nowak, Andrzej

    2004-01-01

    The principle of parsimony, embraced in all areas of science, states that simple explanations are preferable to complex explanations in theory construction. Parsimony, however, can necessitate a trade-off with depth and richness in understanding. The approach of dynamical minimalism avoids this trade-off. The goal of this approach is to identify the simplest mechanisms and fewest variables capable of producing the phenomenon in question. A dynamical model in which change is produced by simple rules repetitively interacting with each other can exhibit unexpected and complex properties. It is thus possible to explain complex psychological and social phenomena with very simple models if these models are dynamic. In dynamical minimalist theories, then, the principle of parsimony can be followed without sacrificing depth in understanding. Computer simulations have proven especially useful for investigating the emergent properties of simple models.
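
    A generic one-line illustration of the "simple rule, complex behavior" point, not one of the paper's models (the logistic map and its parameter values are assumptions chosen for brevity):

    ```python
    import numpy as np

    def logistic_trajectory(r, x0=0.2, n=50):
        """Iterate the one-line rule x -> r*x*(1-x); repeatedly applying this
        minimal dynamical model yields fixed points, cycles, or chaos."""
        xs = [x0]
        for _ in range(n):
            xs.append(r * xs[-1] * (1 - xs[-1]))
        return np.array(xs)

    print(logistic_trajectory(2.8)[-3:])  # settles to a fixed point
    print(logistic_trajectory(3.9)[-3:])  # aperiodic and sensitive to x0
    ```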

  9. Gravitational waves in theories with a non-minimal curvature-matter coupling

    NASA Astrophysics Data System (ADS)

    Bertolami, Orfeu; Gomes, Cláudio; Lobo, Francisco S. N.

    2018-04-01

    Gravitational waves in the presence of a non-minimal curvature-matter coupling are analysed, both in the Newman-Penrose and perturbation theory formalisms. Considering a cosmological constant as a source, the non-minimally coupled matter-curvature model reduces to f( R) theories. This is in good agreement with the most recent data. Furthermore, a dark energy-like fluid is briefly considered, where the propagation equation for the tensor modes differs from the previous scenario, in that the scalar mode equation has an extra term, which can be interpreted as the longitudinal mode being the result of the mixture of two fundamental excitations δR and δρ.

  10. A pilot study designed to acquaint medical educators with basic pedagogic principles.

    PubMed

    McLeod, P J; Brawer, J; Steinert, Y; Chalk, C; McLeod, A

    2008-02-01

    Faculty development activities in medical schools regularly target teaching behaviours but rarely address basic pedagogic principles underlying those behaviours. Although many teachers have an intuitive or tacit knowledge of basic pedagogic principles, overt knowledge of fundamental educational principles is rare. We conducted a short-term pilot study designed to transform teachers' tacit knowledge into explicit knowledge of pedagogic principles. We hypothesized that conscious awareness of these principles will positively influence their teaching effectiveness. The intervention included a workshop, provision of a workbook on pedagogic principles and free access to educational consultants. For the intervention, we chose a purposive sample of experienced teachers at our medical school. Evaluation of the impact of the intervention using questionnaires and semi-structured interviews revealed three notable findings: 1. Participants were surprised to discover the existence of an extensive body of pedagogic science underlying teaching and learning. 2. They were enthusiastic about the intervention and expressed interest in learning more about basic pedagogic principles. 3. The knowledge acquired had an immediate impact on their teaching.

  11. Orthopaedic traumatology: fundamental principles and current controversies for the acute care surgeon

    PubMed Central

    Pharaon, Shad K; Schoch, Shawn; Marchand, Lucas; Mirza, Amer

    2018-01-01

    Multiply injured patients with fractures are co-managed by acute care surgeons and orthopaedic surgeons. In most centers, orthopaedic surgeons definitively manage fractures, but preliminary management, including washouts, splinting, reductions, and external fixations, may be performed by selected acute care surgeons. The acute care surgeon should have a working knowledge of orthopaedic terminology to communicate with colleagues effectively. They should have an understanding of the composition of bone, periosteum, and cartilage, and their reaction when there is an injury. Fractures are usually fixed urgently, but some multiply injured patients are better served with a damage control strategy. Extremity compartment syndrome should be suspected in all critically injured patients with or without fractures and a low threshold for compartment pressure measurements or empiric fasciotomy maintained. Acute care surgeons performing rib fracture fixation and other chest wall injury reconstructions should follow the principles of open fracture reduction and stabilization. PMID:29766123

  12. Orthopaedic traumatology: fundamental principles and current controversies for the acute care surgeon.

    PubMed

    Pharaon, Shad K; Schoch, Shawn; Marchand, Lucas; Mirza, Amer; Mayberry, John

    2018-01-01

    Multiply injured patients with fractures are co-managed by acute care surgeons and orthopaedic surgeons. In most centers, orthopaedic surgeons definitively manage fractures, but preliminary management, including washouts, splinting, reductions, and external fixations, may be performed by selected acute care surgeons. The acute care surgeon should have a working knowledge of orthopaedic terminology to communicate with colleagues effectively. They should have an understanding of the composition of bone, periosteum, and cartilage, and their reaction when there is an injury. Fractures are usually fixed urgently, but some multiply injured patients are better served with a damage control strategy. Extremity compartment syndrome should be suspected in all critically injured patients with or without fractures and a low threshold for compartment pressure measurements or empiric fasciotomy maintained. Acute care surgeons performing rib fracture fixation and other chest wall injury reconstructions should follow the principles of open fracture reduction and stabilization.

  13. Competency Test Items for Applied Principles of Agribusiness and Natural Resources Occupations. Agricultural Production Component. A Report of Research.

    ERIC Educational Resources Information Center

    Cheek, Jimmy G.; McGhee, Max B.

    An activity was undertaken to develop written criterion-referenced tests for the agricultural production component of Applied Principles of Agribusiness and Natural Resources Occupations. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist…

  14. A New Principle of Sound Frequency Analysis

    NASA Technical Reports Server (NTRS)

    Theodorsen, Theodore

    1932-01-01

    In connection with the study of aircraft and propeller noises, the National Advisory Committee for Aeronautics has developed an instrument for sound-frequency analysis which differs fundamentally from previous types, and which, owing to its simplicity of principle, construction, and operation, has proved to be of value in this investigation. The method is based on the well-known fact that the Ohmic loss in an electrical resistance is equal to the sum of the losses of the harmonic components of a complex wave, except for the case in which any two components approach or attain vectorial identity, in which case the Ohmic loss is increased by a definite amount. The principle of frequency analysis has been presented mathematically and a number of distinct advantages relative to previous methods have been pointed out. An automatic recording instrument embodying this principle is described in detail. It employs a beat-frequency oscillator as a source of variable frequency. A large number of experiments have verified the predicted superiority of the method. A number of representative records are presented.
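
    In modern notation (assumed here; the report's own symbols may differ), the property the instrument exploits is that for a complex wave with components \(I_n \cos(\omega_n t + \phi_n)\) passed through a resistance R, the mean Ohmic loss is

    \[
    \overline{P} = \frac{R}{2}\sum_n I_n^2
    \]

    when all component frequencies are distinct, whereas bringing a search component \(I_s\) into frequency coincidence with a signal component \(I_k\) adds the cross term \(R\, I_s I_k \cos(\phi_s - \phi_k)\); sweeping the beat-frequency oscillator therefore makes each signal component announce itself as a change in the measured loss.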

  15. Principles for integrating reactive species into in vivo biological processes: Examples from exercise physiology.

    PubMed

    Margaritelis, Nikos V; Cobley, James N; Paschalis, Vassilis; Veskoukis, Aristidis S; Theodorou, Anastasios A; Kyparos, Antonios; Nikolaidis, Michalis G

    2016-04-01

    The equivocal role of reactive species and redox signaling in exercise responses and adaptations is an example clearly showing the inadequacy of current redox biology research to shed light on fundamental biological processes in vivo. Part of the answer probably relies on the extreme complexity of the in vivo redox biology and the limitations of the currently applied methodological and experimental tools. We propose six fundamental principles that should be considered in future studies to mechanistically link reactive species production to exercise responses or adaptations: 1) identify and quantify the reactive species, 2) determine the potential signaling properties of the reactive species, 3) detect the sources of reactive species, 4) locate the domain modified and verify the (ir)reversibility of post-translational modifications, 5) establish causality between redox and physiological measurements, 6) use selective and targeted antioxidants. Fulfilling these principles requires an idealized human experimental setting, which is certainly a utopia. Thus, researchers should choose to satisfy those principles, which, based on scientific evidence, are most critical for their specific research question. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. [Management of spinal metastasis by minimal invasive surgery technique: Surgical principles, indications: A literature review].

    PubMed

    Toquart, A; Graillon, T; Mansouri, N; Adetchessi, T; Blondel, B; Fuentes, S

    2016-06-01

    Spinal metastases are becoming more frequent. This raises the issues of pain and neurological complications, which worsen the functional and survival prognosis of this oncological patient population. The surgical treatment must be as complete as possible: decompress and stabilize without delaying the management of the oncological disease. Minimally invasive surgery techniques are, by definition, less damaging to the musculocutaneous plane than open ones, with a comparable efficiency demonstrated in degenerative and traumatic surgery. They therefore seem applicable and appropriate to this patient population. We detail the different minimally invasive techniques proposed in the management of spinal metastasis. For this, we used our experience developed in degenerative and traumatic pathologies, and we also referred to many authors, establishing a literature review through PubMed and Embase. Thirty-eight articles were selected and allowed us to describe different techniques: percutaneous methods such as vertebro-/kyphoplasty and osteosynthesis, as well as mini-open surgery, through a posterior or anterior approach. We propose a surgical approach using these minimally invasive techniques, first according to the predominant symptom (pain or neurological deficit), then the characteristics of the lesions (number, topography, type…) and the degree of deformity. Whatever the technique, the main goal is to stabilize and decompress, in order to maintain a good quality of life for these fragile patients, without delaying the medical management of the oncological disease. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  17. Fundamental role of bistability in optimal homeostatic control

    NASA Astrophysics Data System (ADS)

    Wang, Guanyu

    2013-03-01

    Bistability is a fundamental phenomenon in nature and has a number of fine properties. However, these properties are consequences of bistability at the physiological level, which do not explain why it had to emerge during evolution. Using optimal homeostasis as the first principle and Pontryagin's Maximum Principle as the optimization approach, I find that bistability emerges as an indispensable control mechanism. Because the mathematical model is general and the result is independent of parameters, it is likely that most biological systems use bistability to control homeostasis. Glucose homeostasis represents a good example. It turns out that bistability is the only solution to a dilemma in glucose homeostasis: high insulin efficiency is required for rapid plasma glucose clearance, whereas an insulin-sparing state is required to guarantee the brain's safety during fasting. This new perspective can illuminate studies on the twin epidemics of obesity and diabetes and the corresponding intervention strategies. For example, overnutrition and sedentary lifestyle may represent sudden environmental changes that cause the loss of optimality, which may contribute to the marked rise of obesity and diabetes in our generation.
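
    A generic numerical illustration of bistability (not the paper's glucose model; the cubic right-hand side and the initial conditions are assumptions): two nearby starting points are attracted to two different stable states.

    ```python
    from scipy.integrate import solve_ivp

    def bistable(t, x):
        """dx/dt = x - x**3: the stable states x = -1 and x = +1 are separated
        by the unstable state x = 0, the hallmark of a bistable system."""
        return x - x**3

    for x0 in (-0.2, 0.2):
        sol = solve_ivp(bistable, (0.0, 20.0), [x0], rtol=1e-8)
        print(f"x0 = {x0:+.1f} -> settles near {sol.y[0, -1]:+.3f}")
    ```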

  18. Findings in Experimental Psychology as Functioning Principles of Theatrical Design.

    ERIC Educational Resources Information Center

    Caldwell, George

    A gestalt approach to theatrical design seems to provide some ready and stable explanations for a number of issues in the scenic arts. Gestalt serves as the theoretical base for a number of experiments in psychology whose findings appear to delineate the principles of art to be used in scene design. The fundamental notion of gestalt theory…

  19. Competency Test Items for Applied Principles of Agribusiness and Natural Resources Occupations. Agricultural Mechanics Component. A Report of Research.

    ERIC Educational Resources Information Center

    Cheek, Jimmy G.; McGhee, Max B.

    An activity was undertaken to develop written criterion-referenced tests for the agricultural mechanics component of the Applied Principles of Agribusiness and Natural Resources. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist of three…

  20. Competency Test Items for Applied Principles of Agribusiness and Natural Resources Occupations. Common Core Component. A Report of Research.

    ERIC Educational Resources Information Center

    Cheek, Jimmy G.; McGhee, Max B.

    An activity was undertaken to develop written criterion-referenced tests for the common core component of Applied Principles of Agribusiness and Natural Resources Occupations. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist of three…

  1. Competency Test Items for Applied Principles of Agribusiness and Natural Resources Occupations. Agricultural Resources Component. A Report of Research.

    ERIC Educational Resources Information Center

    Cheek, Jimmy G.; McGhee, Max B.

    An activity was undertaken to develop written criterion-referenced tests for the agricultural resources component of Applied Principles of Agribusiness and Natural Resources. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist of three…

  2. Applications of minimal physiologically-based pharmacokinetic models

    PubMed Central

    Cao, Yanguang

    2012-01-01

    Conventional mammillary models are frequently used for pharmacokinetic (PK) analysis when only blood or plasma data are available. Such models depend on the quality of the drug disposition data and have vague biological features. An alternative minimal-physiologically-based PK (minimal-PBPK) modeling approach is proposed which inherits and lumps major physiologic attributes from whole-body PBPK models. The body and model are represented as actual blood and tissue (usually total body weight) volumes, fractions (fd) of cardiac output with Fick’s Law of Perfusion, tissue/blood partitioning (Kp), and systemic or intrinsic clearance. Analyzing only blood or plasma concentrations versus time, the minimal-PBPK models parsimoniously generate physiologically-relevant PK parameters which are more easily interpreted than those from mammillary models. The minimal-PBPK models were applied to four types of therapeutic agents and conditions. The models well captured the human PK profiles of 22 selected beta-lactam antibiotics allowing comparison of fitted and calculated Kp values. Adding a classical hepatic compartment with hepatic blood flow allowed joint fitting of oral and intravenous (IV) data for four hepatic elimination drugs (dihydrocodeine, verapamil, repaglinide, midazolam) providing separate estimates of hepatic intrinsic clearance, non-hepatic clearance, and pre-hepatic bioavailability. The basic model was integrated with allometric scaling principles to simultaneously describe moxifloxacin PK in five species with common Kp and fd values. A basic model assigning clearance to the tissue compartment well characterized plasma concentrations of six monoclonal antibodies in human subjects, providing good concordance of predictions with expected tissue kinetics. The proposed minimal-PBPK modeling approach offers an alternative and more rational basis for assessing PK than compartmental models. PMID:23179857
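
    A minimal sketch of the two-compartment (blood plus lumped tissue) structure described above, written with Fick's law of perfusion; the parameter values are illustrative assumptions, not the fitted values reported in the paper.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Assumed illustrative parameters at roughly human scale, not fitted values.
    Q    = 5.6 * 60    # cardiac output, L/h
    fd   = 0.3         # fraction of cardiac output perfusing the lumped tissue
    Vb   = 5.0         # blood volume, L
    Vt   = 35.0        # lumped tissue volume, L
    Kp   = 2.0         # tissue/blood partition coefficient
    CL   = 10.0        # systemic clearance from blood, L/h
    dose = 100.0       # IV bolus, mg

    def minimal_pbpk(t, y):
        Cb, Ct = y
        # Fick's law of perfusion: exchange is driven by Cb - Ct/Kp
        dCb = (fd * Q * (Ct / Kp - Cb) - CL * Cb) / Vb
        dCt = (fd * Q * (Cb - Ct / Kp)) / Vt
        return [dCb, dCt]

    sol = solve_ivp(minimal_pbpk, (0.0, 24.0), [dose / Vb, 0.0],
                    t_eval=np.linspace(0.0, 24.0, 7))
    print(np.round(sol.y[0], 3))   # blood concentration profile, mg/L
    ```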

  3. Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information.

    PubMed

    Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing

    2016-01-01

    Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automata (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity O(n) than a naive comparison of transitions O(n²). Few states need to be refined by the hash table, because most states have been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft's algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms.
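
    The backward-depth algorithm itself is not reproduced here; as a hedged point of comparison, the sketch below implements the classical partition-refinement (Moore-style) baseline that such methods improve on, to make the "split blocks until no transition distinguishes two states" idea concrete (the DFA encoding is an assumption).

    ```python
    def moore_minimize(states, alphabet, delta, accepting):
        """Classical partition refinement (Moore's algorithm): start from the
        accepting/non-accepting split and refine blocks until no transition
        distinguishes two states in the same block.  O(n^2) in the worst case,
        the baseline that backward-depth / Hopcroft-style methods beat."""
        partition = [set(accepting), set(states) - set(accepting)]
        partition = [b for b in partition if b]
        changed = True
        while changed:
            changed = False
            block_of = {s: i for i, b in enumerate(partition) for s in b}
            new_partition = []
            for block in partition:
                groups = {}
                for s in block:
                    sig = tuple(block_of[delta[s][a]] for a in alphabet)
                    groups.setdefault(sig, set()).add(s)
                new_partition.extend(groups.values())
            if len(new_partition) != len(partition):
                partition, changed = new_partition, True
        return partition

    # toy DFA over {0, 1}: states q1 and q2 are equivalent and get merged
    delta = {"q0": {"0": "q1", "1": "q2"},
             "q1": {"0": "q3", "1": "q3"},
             "q2": {"0": "q3", "1": "q3"},
             "q3": {"0": "q3", "1": "q3"}}
    print(moore_minimize(delta.keys(), "01", delta, {"q3"}))
    ```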

  4. Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information

    PubMed Central

    Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing

    2016-01-01

    Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automata (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity O(n) than a naive comparison of transitions O(n²). Few states need to be refined by the hash table, because most states have been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft’s algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms. PMID:27806102

  5. Minimizing communication cost among distributed controllers in software defined networks

    NASA Astrophysics Data System (ADS)

    Arlimatti, Shivaleela; Elbreiki, Walid; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed

    2016-08-01

    Software Defined Networking (SDN) is a new paradigm that increases the flexibility of today's networks by promising a programmable network. The fundamental idea behind this new architecture is to simplify network complexity by decoupling the control plane and data plane of the network devices, and by making the control plane centralized. Recently, controllers have been distributed to solve the problem of a single point of failure, and to increase scalability and flexibility during workload distribution. Even though controllers are flexible and scalable enough to accommodate a larger number of network switches, the problem of intercommunication cost between distributed controllers is still a challenging issue in the Software Defined Network environment. This paper aims to fill the gap by proposing a new mechanism that minimizes intercommunication cost with a graph partitioning algorithm, an NP-hard problem. The methodology proposed in this paper is the swapping of network elements between controller domains to minimize communication cost by calculating the communication gain. The swapping of elements minimizes inter- and intra-domain communication cost among network domains. We validate our work with the OMNeT++ simulation environment. Simulation results show that the proposed mechanism minimizes the inter-domain communication cost among controllers compared to traditional distributed controllers.
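
    A minimal sketch of the swap-gain idea in the Kernighan-Lin style (an assumption; the paper's exact gain definition and data structures are not reproduced here): moving a switch pays off when the traffic it exchanges with the destination domain exceeds the traffic it exchanges with its current domain.

    ```python
    def move_gain(switch, src, dst, domains, traffic):
        """Reduction in inter-domain (controller-to-controller) traffic obtained
        by moving `switch` from domain `src` to domain `dst`.  `traffic[(u, v)]`
        holds the symmetric load between switches u and v."""
        external = internal = 0.0
        for other_domain, members in domains.items():
            for v in members:
                if v == switch:
                    continue
                load = traffic.get((switch, v), 0.0) + traffic.get((v, switch), 0.0)
                if other_domain == src:
                    internal += load   # becomes inter-domain after the move
                elif other_domain == dst:
                    external += load   # becomes intra-domain after the move
        return external - internal     # positive gain => the move reduces cost

    # toy example: two controller domains and three switches
    domains = {"C1": {"s1", "s2"}, "C2": {"s3"}}
    traffic = {("s1", "s2"): 1.0, ("s2", "s3"): 5.0}
    print(move_gain("s2", "C1", "C2", domains, traffic))  # 5.0 - 1.0 = 4.0
    ```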

  6. A Greatly Under-Appreciated Fundamental Principle of Physical Organic Chemistry

    PubMed Central

    Cox, Robin A.

    2011-01-01

    If a species does not have a finite lifetime in the reaction medium, it cannot be a mechanistic intermediate. This principle was first enunciated by Jencks, as the concept of an enforced mechanism. For instance, neither primary nor secondary carbocations have long enough lifetimes to exist in an aqueous medium, so SN1 reactions involving these substrates are not possible, and an SN2 mechanism is enforced. Only tertiary carbocations and those stabilized by resonance (benzyl cations, acylium ions) are stable enough to be reaction intermediates. More importantly, it is now known that neither H3O+ nor HO− exist as such in dilute aqueous solution. Several recent high-level calculations on large proton clusters are unable to localize the positive charge; it is found to be simply “on the cluster” as a whole. The lifetime of any ionized water species is exceedingly short, a few molecular vibrations at most; the best experimental study, using modern IR instrumentation, has the most probable hydrated proton structure as H13O6+, but only an estimated quarter of the protons are present even in this form at any given instant. Thanks to the Grotthuss mechanism of chain transfer along hydrogen bonds, in reality a proton or a hydroxide ion is simply instantly available anywhere it is needed for reaction. Important mechanistic consequences result. Any charged oxygen species (e.g., a tetrahedral intermediate) is also not going to exist long enough to be a reaction intermediate, unless the charge is stabilized in some way, usually by resonance. General acid catalysis is the rule in reactions in concentrated aqueous acids. The Grotthuss mechanism also means that reactions involving neutral water are favored; the solvent is already highly structured, so the entropy involved in bringing several solvent molecules to the reaction center is unimportant. Examples are given. PMID:22272074

  7. FUNDAMENTALS LEARNING LABORATORIES IN INDUSTRIAL EDUCATION CENTERS, TECHNICAL INSTITUTES AND COMMUNITY COLLEGES IN NORTH CAROLINA.

    ERIC Educational Resources Information Center

    MARTIN, WALTER TRAVIS, JR.

    IN 1964, NORTH CAROLINA ESTABLISHED A SYSTEM OF "FUNDAMENTALS LEARNING LABORATORIES" WHERE ADULTS MIGHT OBTAIN PROGRAMED SELF-INSTRUCTION AT MINIMAL COST (A $2.00 REGISTRATION FEE). IN A DESCRIPTIVE STUDY OF THE 17 LABORATORIES OPERATING IN 1965, DATA WERE GATHERED BY QUESTIONNAIRES AND INTERVIEWS. FINDINGS INCLUDED THE FOLLOWING-- (1)…

  8. Large-scale evidence of dependency length minimization in 37 languages

    PubMed Central

    Futrell, Richard; Mahowald, Kyle; Gibson, Edward

    2015-01-01

    Explaining the variation between human languages and the constraints on that variation is a core goal of linguistics. In the last 20 y, it has been claimed that many striking universals of cross-linguistic variation follow from a hypothetical principle that dependency length—the distance between syntactically related words in a sentence—is minimized. Various models of human sentence production and comprehension predict that long dependencies are difficult or inefficient to process; minimizing dependency length thus enables effective communication without incurring processing difficulty. However, despite widespread application of this idea in theoretical, empirical, and practical work, there is not yet large-scale evidence that dependency length is actually minimized in real utterances across many languages; previous work has focused either on a small number of languages or on limited kinds of data about each language. Here, using parsed corpora of 37 diverse languages, we show that overall dependency lengths for all languages are shorter than conservative random baselines. The results strongly suggest that dependency length minimization is a universal quantitative property of human languages and support explanations of linguistic variation in terms of general properties of human information processing. PMID:26240370
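
    As a concrete illustration of the quantity at stake, the sketch below computes the total dependency length of a toy parsed sentence (the sum over all head-dependent arcs of the distance between the two words' linear positions) and compares it with random reorderings of the same words. This is only a simplified analogue of the paper's baselines, which are more conservative (e.g., they respect constraints such as projectivity); the sentence and tree here are invented.

```python
# Total dependency length of a sentence versus random linear reorderings.
# Toy illustration of the quantity discussed in the abstract above.
import random

def dependency_length(order, heads):
    """Sum of |position(head) - position(dependent)| over all dependency arcs.

    order: list of word ids in linear order; heads: dict dependent -> head.
    """
    pos = {w: i for i, w in enumerate(order)}
    return sum(abs(pos[h] - pos[d]) for d, h in heads.items())

# "the dog chased the cat" with 'chased' as root (the root arc is excluded here).
words = ["the1", "dog", "chased", "the2", "cat"]
heads = {"the1": "dog", "dog": "chased", "the2": "cat", "cat": "chased"}

observed = dependency_length(words, heads)
random.seed(0)
baseline = []
for _ in range(1000):
    shuffled = words[:]
    random.shuffle(shuffled)
    baseline.append(dependency_length(shuffled, heads))

# Observed length versus mean length under random word order.
print(observed, sum(baseline) / len(baseline))
```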

  9. How Do Students in an Innovative Principle-Based Mechanics Course Understand Energy Concepts?

    ERIC Educational Resources Information Center

    Ding, Lin; Chabay, Ruth; Sherwood, Bruce

    2013-01-01

    We investigated students' conceptual learning of energy topics in an innovative college-level introductory mechanics course, entitled Matter & Interactions (M&I) Modern Mechanics. This course differs from traditional curricula in that it emphasizes application of a small number of fundamental principles across various scales, involving…

  10. Tracking the deployment of the integrated metropolitan intelligent transportation systems infrastructure in the USA : FY 1997 results

    DOT National Transportation Integrated Search

    2002-01-01

    The essence of effective environmental justice practice is summarized in three fundamental principles: (1) Avoid, minimize, or mitigate disproportionately high and adverse human health and environmental effects, including social and economic effects,...

  11. Meta-Chirality: Fundamentals, Construction and Applications

    PubMed Central

    Ma, Xiaoliang; Pu, Mingbo; Li, Xiong; Guo, Yinghui; Gao, Ping; Luo, Xiangang

    2017-01-01

    Chiral metamaterials represent a special type of artificial structure that cannot be superposed onto its mirror image. Due to the lack of mirror symmetry, cross-coupling between electric and magnetic fields exists in chiral media and gives rise to the unique electromagnetic characteristics of circular dichroism and optical activity, which provide a new opportunity to tune polarization and realize a negative refractive index. Chiral metamaterials have attracted great attention in recent years and have given rise to a series of applications in polarization manipulation, imaging, chemical and biological detection, and nonlinear optics. Here we review the fundamental theory of chiral media and analyze the construction principles of some typical chiral metamaterials. Then, the progress in extrinsic chiral metamaterials, absorbing chiral metamaterials, and reconfigurable chiral metamaterials is summarized. In the last section, future trends in chiral metamaterials and applications in nonlinear optics are introduced. PMID:28513560

  12. Relativities of fundamentality

    NASA Astrophysics Data System (ADS)

    McKenzie, Kerry

    2017-08-01

    S-dualities have been held to have radical implications for our metaphysics of fundamentality. In particular, it has been claimed that they make the fundamentality status of a physical object theory-relative in an important new way. But what physicists have had to say on the issue has not been clear or consistent, and in particular seems to be ambiguous between whether S-dualities demand an anti-realist interpretation of fundamentality talk or merely a revised realism. This paper is an attempt to bring some clarity to the matter. After showing that even antecedently familiar fundamentality claims are true only relative to a raft of metaphysical, physical, and mathematical assumptions, I argue that the relativity of fundamentality inherent in S-duality nevertheless represents something new, and that part of the reason for this is that it has both realist and anti-realist implications for fundamentality talk. I close by discussing the broader significance that S-dualities have for structuralist metaphysics and for fundamentality metaphysics more generally.

  13. Challenging the bioethical application of the autonomy principle within multicultural societies.

    PubMed

    Fagan, Andrew

    2004-01-01

    This article critically re-examines the application of the principle of patient autonomy within bioethics. In complex societies such as those found in North America and Europe health care professionals are increasingly confronted by patients from diverse ethnic, cultural, and religious backgrounds. This affects the relationship between clinicians and patients to the extent that patients' deliberations upon the proposed courses of treatment can, in various ways and to varying extents, be influenced by their ethnic, cultural, and religious commitments. The principle of patient autonomy is the main normative constraint imposed upon medical treatment. Bioethicists typically appeal to the principle of patient autonomy as a means for generally attempting to resolve conflict between patients and clinicians. In recent years a number of bioethicists have responded to the condition of multiculturalism by arguing that the autonomy principle provides the basis for a common moral discourse capable of regulating the relationship between clinicians and patients in those situations where patients' beliefs and commitments do or may contradict the ethos of biomedicine. This article challenges that claim. I argue that the precise manner in which the autonomy principle is philosophically formulated within such accounts prohibits bioethicists' deployment of autonomy as a core ideal for a common moral discourse within multicultural societies. The formulation of autonomy underlying such accounts cannot be extended to simply assimilate individuals' most fundamental religious and cultural commitments and affiliations per se. I challenge the assumption that respecting prospective patients' fundamental religious and cultural commitments is necessarily always compatible with respecting their autonomy. I argue that the character of some peoples' relationship with their cultural or religious community acts to significantly constrain the possibilities for acting autonomously. The implication is

  14. Equivalency principle for magnetoelectroelastic multiferroics with arbitrary microstructure: The phase field approach

    NASA Astrophysics Data System (ADS)

    Ni, Yong; He, Linghui; Khachaturyan, Armen G.

    2010-07-01

    A phase field method is proposed to determine the equilibrium fields of a magnetoelectroelastic multiferroic with arbitrarily distributed constitutive constants under applied loadings. This method is based on a newly developed generalized Eshelby equivalency principle, in which the elastic strain, electrostatic, and magnetostatic fields at equilibrium in the original heterogeneous system are exactly the same as those in an equivalent homogeneous magnetoelectroelastic coupled or uncoupled system with properly chosen distributed effective eigenstrain, polarization, and magnetization fields. Finding these effective fields fully solves the equilibrium elasticity, electrostatics, and magnetostatics in the original heterogeneous multiferroic. The paper formulates a variational principle proving that the effective fields are minimizers of an appropriate closed-form energy functional. The proposed phase field approach produces the energy-minimizing effective fields (and thus solves the general multiferroic problem) as the result of an artificial relaxation process described by the Ginzburg-Landau-Khalatnikov kinetic equations.
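
    The artificial relaxation process referred to here is, schematically, a Ginzburg-Landau-Khalatnikov (time-dependent Ginzburg-Landau) gradient flow. The generic form below is given only for orientation; the specific energy functional F and field variables are defined in the paper and are not reproduced here.

```latex
% Schematic Ginzburg-Landau-Khalatnikov relaxation driving the effective fields
% \eta_i (components of eigenstrain, polarization, magnetization) toward the
% minimizers of the closed-form energy functional F; L_{ij} are kinetic coefficients.
\frac{\partial \eta_i(\mathbf{r},t)}{\partial t}
  = -\,L_{ij}\,\frac{\delta F}{\delta \eta_j(\mathbf{r},t)},
\qquad
\frac{\mathrm{d}F}{\mathrm{d}t} \le 0 .
```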

  15. Invisible Ink Revealed: Concept, Context, and Chemical Principles of "Cold War" Writing

    ERIC Educational Resources Information Center

    Macrakis, Kristie; Bell, Elizabeth K.; Perry, Dale L.; Sweeder, Ryan D.

    2012-01-01

    By modifying secret writing formulas uncovered from the archives of the East German Ministry of State Security (MfS or Stasi), a novel general chemistry secret writing laboratory was developed. The laboratory combines science and history that highlights several fundamental chemical principles related to the writing. These include catalysis, redox…

  16. Is Memory Search Governed by Universal Principles or Idiosyncratic Strategies?

    PubMed Central

    Healey, M. Karl; Kahana, Michael J.

    2013-01-01

    Laboratory paradigms have provided an empirical foundation for much of psychological science. Some have argued, however, that such paradigms are highly susceptible to idiosyncratic strategies and that rather than reflecting fundamental cognitive principles, many findings are artifacts of averaging across participants who employ different strategies. We develop a set of techniques to rigorously test the extent to which average data are distorted by such strategy differences and apply these techniques to free recall data from the Penn Electrophysiology of Encoding and Retrieval Study (PEERS). Recall initiation showed evidence of subgroups: the majority of participants initiated recall from the last item in the list, but one subgroup showed elevated initiation probabilities for items 2–4 back from the end of the list and another showed elevated probabilities for the beginning of the list. By contrast, serial position curves and temporal and semantic clustering functions were remarkably consistent, with almost every participant exhibiting a recognizable version of the average function, suggesting that these functions reflect fundamental principles of the memory system. The approach taken here can serve as a model for evaluating the extent to which other laboratory paradigms are influenced by individual differences in strategy use. PMID:23957279

  17. Twelve Principles for Green Energy Storage in Grid Applications.

    PubMed

    Arbabzadeh, Maryam; Johnson, Jeremiah X; Keoleian, Gregory A; Rasmussen, Paul G; Thompson, Levi T

    2016-01-19

    The introduction of energy storage technologies to the grid could enable greater integration of renewables, improve system resilience and reliability, and offer cost effective alternatives to transmission and distribution upgrades. The integration of energy storage systems into the electrical grid can lead to different environmental outcomes based on the grid application, the existing generation mix, and the demand. Given this complexity, a framework is needed to systematically inform design and technology selection about the environmental impacts that emerge when considering energy storage options to improve sustainability performance of the grid. To achieve this, 12 fundamental principles specific to the design and grid application of energy storage systems are developed to inform policy makers, designers, and operators. The principles are grouped into three categories: (1) system integration for grid applications, (2) the maintenance and operation of energy storage, and (3) the design of energy storage systems. We illustrate the application of each principle through examples published in the academic literature, illustrative calculations, and a case study with an off-grid application of vanadium redox flow batteries (VRFBs). In addition, trade-offs that can emerge between principles are highlighted.

  18. A structured policy review of the principles of professional self-regulation.

    PubMed

    Benton, D C; González-Jurado, M A; Beneit-Montesinos, J V

    2013-03-01

    The International Council of Nurses (ICN) has, for many years, based its work on professional self-regulation on a set of 12 principles. These principles are research based and were identified nearly three decades ago. ICN has conducted a number of reviews of the principles; however, changes have been minimal. In the past 5-10 years, a number of authors and governments, often as part of the review of regulatory systems, have started to propose principles to guide the way regulatory frameworks are designed and implemented. These principles vary in number and content. This study examines the current policy literature on principle-based regulation and compares this with the set of principles advocated by the ICN. A systematic search of the literature on principle-based regulation is used as the basis for a qualitative thematic analysis to compare and contrast the 12 principles of self-regulation with more recently published work. A mapping of terms based on a detailed description of the principles used in the various research and policy documents was generated. This mapping forms the basis of a critique of the current ICN principles. Gaps in the principles of professional self-regulation advocated by the ICN were identified. A revised and extended set of 13 principles is needed if contemporary developments in the field of regulatory frameworks are to be accommodated. These revised principles should be considered for adoption by the ICN to underpin their advocacy work on professional self-regulation. © 2013 The Authors. International Nursing Review © 2013 International Council of Nurses.

  19. First-principles definition and measurement of planetary electromagnetic-energy budget.

    PubMed

    Mishchenko, Michael I; Lock, James A; Lacis, Andrew A; Travis, Larry D; Cairns, Brian

    2016-06-01

    The imperative to quantify the Earth's electromagnetic-energy budget with an extremely high accuracy has been widely recognized but has never been formulated in the framework of fundamental physics. In this paper we give a first-principles definition of the planetary electromagnetic-energy budget using the Poynting-vector formalism and discuss how it can, in principle, be measured. Our derivation is based on an absolute minimum of theoretical assumptions, is free of outdated notions of phenomenological radiometry, and naturally leads to the conceptual formulation of an instrument called the double hemispherical cavity radiometer (DHCR). The practical measurement of the planetary energy budget would require flying a constellation of several dozen planet-orbiting satellites hosting identical well-calibrated DHCRs.

  20. First-principles definition and measurement of planetary electromagnetic-energy budget

    NASA Astrophysics Data System (ADS)

    Mishchenko, M. I.; James, L.; Lacis, A. A.; Travis, L. D.; Cairns, B.

    2016-12-01

    The imperative to quantify the Earth's electromagnetic-energy budget with an extremely high accuracy has been widely recognized but has never been formulated in the framework of fundamental physics. In this talk we give a first-principles definition of the planetary electromagnetic-energy budget using the Poynting-vector formalism and discuss how it can, in principle, be measured. Our derivation is based on an absolute minimum of theoretical assumptions, is free of outdated concepts of phenomenological radiometry, and naturally leads to the conceptual formulation of an instrument called the double hemispherical cavity radiometer (DHCR). The practical measurement of the planetary energy budget would require flying a constellation of several dozen planet-orbiting satellites hosting identical well-calibrated DHCRs.

  1. First-Principles Definition and Measurement of Planetary Electromagnetic-Energy Budget

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Lock, James A.; Lacis, Andrew A.; Travis, Larry D.; Cairns, Brian

    2016-01-01

    The imperative to quantify the Earth's electromagnetic-energy budget with an extremely high accuracy has been widely recognized but has never been formulated in the framework of fundamental physics. In this paper we give a first-principles definition of the planetary electromagnetic-energy budget using the Poynting-vector formalism and discuss how it can, in principle, be measured. Our derivation is based on an absolute minimum of theoretical assumptions, is free of outdated notions of phenomenological radiometry, and naturally leads to the conceptual formulation of an instrument called the double hemispherical cavity radiometer (DHCR). The practical measurement of the planetary energy budget would require flying a constellation of several dozen planet-orbiting satellites hosting identical well-calibrated DHCRs.

  2. [Wilson's principles--a base of modern teratology].

    PubMed

    Burdan, Franciszek; Bełzek, Artur; Szumiło, Justyna; Dudka, Jarosław; Korobowicz, Agnieszka; Tokarska, Edyta; Klepacz, Lidia; Bełzek, Marta; Klepacz, Robert

    2006-03-01

    Wilson's principles were formulated after the thalidomide tragedy. They became the fundamental basis for teratological studies of drugs and other factors that may disturb fetal development. It is postulated that susceptibility to a teratogen depends on the genotype and developmental stage of the conceptus. Teratogenic agents act in specific ways on developing cells and tissues. The exposure depends on the agent's nature and availability. Manifestations of deviant development depend on the dosage and exposure frequency. In the case of abnormal development, the final manifestations include death of the embryo or fetus, malformation, growth retardation and functional disorder.

  3. Tensorial Minkowski functionals of triply periodic minimal surfaces

    PubMed Central

    Mickel, Walter; Schröder-Turk, Gerd E.; Mecke, Klaus

    2012-01-01

    A fundamental understanding of the formation and properties of a complex spatial structure relies on robust quantitative tools to characterize morphology. A systematic approach to the characterization of average properties of anisotropic complex interfacial geometries is provided by integral geometry which furnishes a family of morphological descriptors known as tensorial Minkowski functionals. These functionals are curvature-weighted integrals of tensor products of position vectors and surface normal vectors over the interfacial surface. We here demonstrate their use by application to non-cubic triply periodic minimal surface model geometries, whose Weierstrass parametrizations allow for accurate numerical computation of the Minkowski tensors. PMID:24098847
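
    For orientation, one common convention for such a curvature-weighted tensorial functional is sketched below; the normalization and index conventions may differ from those used by the authors.

```latex
% One common convention for a rank-(r+s) translation-covariant Minkowski tensor
% of a closed surface S: a curvature-weighted integral of symmetric tensor
% products of the position vector x and the unit normal n; the weight G_nu is 1,
% the mean curvature H, or the Gaussian curvature K, depending on nu.
W_{\nu}^{r,s}(S) = \int_{S} G_{\nu}(\mathbf{x})\,
  \underbrace{\mathbf{x}\otimes\cdots\otimes\mathbf{x}}_{r\ \text{factors}}
  \otimes
  \underbrace{\mathbf{n}(\mathbf{x})\otimes\cdots\otimes\mathbf{n}(\mathbf{x})}_{s\ \text{factors}}
  \,\mathrm{d}A .
```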

  4. Combustion Fundamentals Research

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Increased emphasis is placed on fundamental and generic research at Lewis Research Center with less systems development efforts. This is especially true in combustion research, where the study of combustion fundamentals has grown significantly in order to better address the perceived long term technical needs of the aerospace industry. The main thrusts for this combustion fundamentals program area are as follows: analytical models of combustion processes, model verification experiments, fundamental combustion experiments, and advanced numeric techniques.

  5. Fundamental Quantum 1/F Noise in Ultrasmall Semi Conductor Devices and Their Optimal Design Principles.

    DTIC Science & Technology

    1986-05-01

    ... quantum 1/f noise ... In that case the Hooge parameter α_H may be written ... Eqs. (4.2)-(4.5). The Hooge formula is thus derived from first principles as a quantum 1/f result with α_H given by Eq. (4.12). All 1/f noise ... between coherent-state 1/f noise and the Umklapp 1/f noise. 1/f noise in n+-p Hg1-xCdxTe occurs in many forms and each form should be tested. If a Hooge…

  6. The biological default state of cell proliferation with variation and motility, a fundamental principle for a theory of organisms.

    PubMed

    Soto, Ana M; Longo, Giuseppe; Montévil, Maël; Sonnenschein, Carlos

    2016-10-01

    The principle of inertia is central to the modern scientific revolution. By postulating this principle Galileo at once identified a pertinent physical observable (momentum) and a conservation law (momentum conservation). He then could scientifically analyze what modifies inertial movement: gravitation and friction. Inertia, the default state in mechanics, represented a major theoretical commitment: there is no need to explain uniform rectilinear motion, rather, there is a need to explain departures from it. By analogy, we propose a biological default state of proliferation with variation and motility. From this theoretical commitment, what requires explanation is proliferative quiescence, lack of variation, lack of movement. That proliferation is the default state is axiomatic for biologists studying unicellular organisms. Moreover, it is implied in Darwin's "descent with modification". Although a "default state" is a theoretical construct and a limit case that does not need to be instantiated, conditions that closely resemble unrestrained cell proliferation are readily obtained experimentally. We will illustrate theoretical and experimental consequences of applying and of ignoring this principle. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. The biological default state of cell proliferation with variation and motility, a fundamental principle for a theory of organisms

    PubMed Central

    SOTO, ANA M.; LONGO, GIUSEPPE; Montévil, Maël; SONNENSCHEIN, CARLOS

    2017-01-01

    The principle of inertia is central to the modern scientific revolution. By postulating this principle Galileo at once identified a pertinent physical observable (momentum) and a conservation law (momentum conservation). He then could scientifically analyze what modifies inertial movement: gravitation and friction. Inertia, the default state in mechanics, represented a major theoretical commitment: there is no need to explain uniform rectilinear motion, rather, there is a need to explain departures from it. By analogy, we propose a biological default state of proliferation with variation and motility. From this theoretical commitment, what requires explanation is proliferative quiescence, lack of variation, lack of movement. That proliferation is the default state is axiomatic for biologists studying unicellular organisms. Moreover, it is implied in Darwin’s “descent with modification”. Although a “default state” is a theoretical construct and a limit case that does not need to be instantiated, conditions that closely resemble unrestrained cell proliferation are readily obtained experimentally. We will illustrate theoretical and experimental consequences of applying and of ignoring this principle. PMID:27381480

  8. Knowledge and Attitude among General Dental Practitioners towards Minimally Invasive Dentistry in Riyadh and AlKharj

    PubMed Central

    Sheddi, Faisal Mohammed; Alharqan, Mesfer Saad; Khawja, Shabnam Gulzar; Vohra, Fahim; Akram, Zohaib; Faden, Asmaa Ahmed; Khalil, Hesham Saleh

    2016-01-01

    Introduction Minimally Invasive Dentistry (MID) emphasizes conservative caries management strategies that result in less destruction of tooth structure, a deviation from the traditional G.V. Black restorative principles. However, there appears to be either a deficiency in knowledge or little intention by general dental practitioners to adopt these principles. Aim The aim of this study was to assess the knowledge and attitude of general dental practitioners towards minimally invasive dentistry in the Riyadh and AlKharj cities of Saudi Arabia. Materials and Methods Self-administered structured questionnaires were handed to general dental practitioners (GDPs) in the cities of Riyadh and AlKharj in Saudi Arabia. Several questions, including Likert-type scale response categories (1–5), were used. The questions assessed the respondents' levels of agreement regarding diagnostic, preventive and restorative techniques such as the use of caries risk assessment, use of high-fluoride toothpaste, Atraumatic Restorative Treatment and tunnel preparations. Results Of the 200 GDPs approached, 161 completed the questionnaires, an overall response rate of 80.5%. The GDPs showed significantly different approaches with regard to the use of a sharp explorer for caries detection (p = 0.014). Almost 60% of the participants had received no special education regarding minimally invasive procedures. Moreover, GDPs who had received MID training showed significantly better knowledge and attitude in adopting minimally invasive techniques for both the diagnosis and treatment of dental caries. Conclusion Although GDPs possess knowledge about the benefits of MID, the study showed deficiencies in their attitudes towards caries detection methods and the application of minimally invasive dentistry procedures. PMID:27630962

  9. A minimal model for the structural energetics of VO2

    NASA Astrophysics Data System (ADS)

    Kim, Chanul; Marianetti, Chris; The Marianetti Group Team

    Resolving the structural, magnetic, and electronic structure of VO2 from the first principles of quantum mechanics is still a forefront problem despite decades of attention. Hybrid functionals have been shown to qualitatively ruin the structural energetics. While density functional theory (DFT) combined with cluster extensions of dynamical mean-field theory (DMFT) has demonstrated promising results in terms of the electronic properties, structural phase stability has not yet been addressed. In order to capture the basic physics of the structural transition, we propose a minimal model of VO2 based on the one-dimensional Peierls-Hubbard model and parameterize it based on DFT calculations of VO2. The total energy versus dimerization in the minimal model is then solved numerically exactly using the density matrix renormalization group (DMRG) and compared to the Hartree-Fock solution. We demonstrate that the Hartree-Fock solution exhibits the same pathologies as DFT+U, and spin density functional theory for that matter, while the DMRG solution is consistent with experimental observation. Our results demonstrate the critical role of non-locality in the total energy, which will need to be accounted for to obtain a complete description of VO2 from first principles. The authors acknowledge support from FAME, one of six centers of STARnet, a Semiconductor Research Corporation program sponsored by MARCO and DARPA.
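
    For readers unfamiliar with the model class, a generic one-dimensional Peierls-Hubbard Hamiltonian (the standard form on which such a minimal model is typically built; the paper's specific DFT-derived parameterization is not reproduced here) reads:

```latex
% Generic 1D Peierls-Hubbard model: hopping t modulated by the dimerization
% displacements u_i with electron-lattice coupling alpha, on-site repulsion U,
% and an elastic energy penalty with stiffness K.
H = -\sum_{i,\sigma}\bigl[t - \alpha\,(u_{i+1}-u_i)\bigr]
      \bigl(c^{\dagger}_{i,\sigma}c^{\phantom{\dagger}}_{i+1,\sigma} + \text{h.c.}\bigr)
    + U\sum_i n_{i\uparrow}n_{i\downarrow}
    + \frac{K}{2}\sum_i (u_{i+1}-u_i)^2
```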

  10. Minimally Invasive Surgery for the Treatment of Colorectal Cancer

    PubMed Central

    Karcz, W. Konrad; von Braun, William

    2016-01-01

    Background Reduction in operative trauma along with an improvement in endoscopic access has undoubtedly occupied surgical minds for at least the past 3 decades. It is not at all surprising that minimally invasive colon surgery has come a long way since the first laparoscopic appendectomy by Semm in 1981. It is common knowledge that the recent developments in video and robotic technologies have significantly furthered advancements in laparoscopic and minimally invasive surgery. This has led to the overall acceptance of the treatment of benign colorectal pathology via the endoscopic route. Malignant disease, however, is still primarily treated by conventional approaches. Methods and Results This review article is based on a literature search pertaining to advances in minimally invasive colorectal surgery for the treatment of malignant pathology, as well as on personal experience in the field over the same period of time. Our search was limited to level I and II clinical papers only, according to the evidence-based medicine guidelines. We attempted to present our unbiased view on the subject relying only on the evidence available. Conclusion Focusing on advances in colorectal minimally invasive surgery, it has to be stated that there are still a number of unanswered questions regarding the surgical management of malignant disease with this approach. These questions relate not only to the boundaries set for the use of minimally invasive techniques in this field but also to the exact modality best suited to the treatment of each particular case whilst maintaining state-of-the-art oncological principles. PMID:27493947

  11. AHRQ series paper 2: principles for developing guidance: AHRQ and the effective health-care program.

    PubMed

    Helfand, Mark; Balshem, Howard

    2010-05-01

    This article describes some of the fundamental principles that have been developed to guide the work of producing comparative effectiveness reviews (CERs). We briefly describe the role stakeholders play in providing important insights that inform the evidence-gathering process, and discuss the critical role of analytic frameworks in illuminating the relationship between surrogate measures and health outcomes, providing an understanding of the context in which clinical decisions are made and the uncertainties that underlie clinical controversies. We describe the Effective Health Care program conceptual model for considering different types of evidence that emphasizes minimizing the risk of bias, but places high-quality, highly applicable evidence about effectiveness at the top of the hierarchy. Finally, we briefly describe areas of future methodological research. CERs have become a foundation for decision-making in clinical practice and health policy. To be useful, CERs must approach the evidence from a patient-centered perspective; explore the clinical logic underlying the rationale for a service; cast a broad net with respect to types of evidence, placing a high value on effectiveness and applicability, in addition to internal validity; and, present benefits and harms for treatments and tests in a consistent way.

  12. A Brief Statement on Educational Principle of the People's Republic of China

    ERIC Educational Resources Information Center

    Yang, Tianping

    2005-01-01

    China issued two educational principles in 1960s and 1990s respectively. Both are composed of three fundamental elements--general aim of education, concrete purposes of education and ways of realizing them. Both emphasize the combination of education and work as well as the all-round development of the educatee. The difference between the two is…

  13. Ethics in oncology: principles and responsibilities declared in the Italian Ragusa statement.

    PubMed

    Gori, Stefania; Pinto, Carmine; Caminiti, Caterina; Aprile, Giuseppe; Marchetti, Paolo; Perrone, Francesco; Di Maio, Massimo; Omodeo Salè, Emanuela; Mancuso, Annamaria; De Cicco, Maurizio; Di Costanzo, Francesco; Crispino, Sergio; Passalacqua, Rodolfo; Merlano, Marco; Zagonel, Vittorina; Fioretto, Luisa; Micallo, Giovanni; Labianca, Roberto; Bordonaro, Roberto; Comandone, Alessandro; Spinsanti, Sandro; Iacono, Carmelo; Nicolis, Fabrizio

    2016-12-01

    Cancer care involves many ethical issues. The need for more patient-centered healthcare, together with the improved empowerment of every person diagnosed with cancer, has been addressed by the Italian Association of Medical Oncology (AIOM) and eventually translated into the Ragusa statement. This position paper describes the philosophy that underlies this document and its fundamental principles.

  14. Pain Now or Later: An Outgrowth Account of Pain-Minimization

    PubMed Central

    Chen, Shuai; Zhao, Dan; Rao, Li-Lin; Liang, Zhu-Yuan; Li, Shu

    2015-01-01

    The preference for immediate negative events contradicts the minimizing loss principle given that the value of a delayed negative event is discounted by the amount of time it is delayed. However, this preference is understandable if we assume that the value of a future outcome is not restricted to the discounted utility of the outcome per se but is complemented by an anticipated negative utility assigned to an unoffered dimension, which we termed the “outgrowth.” We conducted three studies to establish the existence of the outgrowth and empirically investigated the mechanism underlying the preference for immediate negative outcomes. Study 1 used a content analysis method to examine whether the outgrowth was generated in accompaniment with the delayed negative events. The results revealed that the investigated outgrowth was composed of two elements. The first component is the anticipated negative emotions elicited by the delayed negative event, and the other is the anticipated rumination during the waiting process, in which one cannot stop thinking about the negative event. Study 2 used a follow-up investigation to examine whether people actually experienced the negative emotions they anticipated in a real situation of waiting for a delayed negative event. The results showed that the participants actually experienced a number of negative emotions when waiting for a negative event. Study 3 examined whether the existence of the outgrowth could make the minimizing loss principle work. The results showed that the difference in pain anticipation between the immediate event and the delayed event could significantly predict the timing preference of the negative event. Our findings suggest that people’s preference for experiencing negative events sooner serves to minimize the overall negative utility, which is divided into two parts: the discounted utility of the outcome itself and an anticipated negative utility assigned to the outgrowth. PMID:25747461

  15. Nanotechnology: Fundamental Principles and Applications

    NASA Astrophysics Data System (ADS)

    Ranjit, Koodali T.; Klabunde, Kenneth J.

    Nanotechnology research is based primarily on molecular manufacturing. Although several definitions have been widely used in the past to describe the field of nanotechnology, it is worth pointing out that the National Nanotechnology Initiative (NNI), a federal research and development program approved by Congress in 2001, defines nanotechnology only if the following three aspects are involved: (1) research and technology development at the atomic, molecular, or macromolecular levels, in the length scale of approximately the 1-100 nanometer range, (2) creating and using structures, devices, and systems that have novel properties and functions because of their small and/or intermediate size, and (3) the ability to control or manipulate matter on the atomic scale. Nanotechnology in essence is the technology based on the manipulation of individual atoms and molecules to build complex structures to atomic specifications.

  16. Real time selective harmonic minimization for multilevel inverters using genetic algorithm and artificial neural network angle generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Filho, Faete J; Tolbert, Leon M; Ozpineci, Burak

    2012-01-01

    The work developed here proposes a methodology for calculating switching angles for varying DC sources in a multilevel cascaded H-bridge converter. In this approach the required fundamental is achieved, the lower harmonics are minimized, and the system can be implemented in real time with low memory requirements. A genetic algorithm (GA) is used as the stochastic search method to find the solution for the set of equations in which the input voltages are the known variables and the switching angles are the unknowns. With the dataset generated by the GA, an artificial neural network (ANN) is trained to store the solutions without excessive memory storage requirements. This trained ANN then senses the voltage of each cell and produces the switching angles in order to regulate the fundamental at 120 V and eliminate or minimize the low-order harmonics while operating in real time.
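
    For a staircase-modulated cascaded H-bridge with per-cell DC voltages V_k and switching angles θ_k, the h-th odd harmonic of the output is proportional to Σ_k V_k cos(hθ_k). The sketch below shows the kind of fitness function such a GA could minimize; the weights, the choice of targeted harmonics, and the interpretation of the 120 V target as an RMS fundamental are illustrative assumptions, not the authors' exact formulation.

```python
# Illustrative fitness for selective harmonic minimization in a cascaded
# H-bridge inverter (staircase modulation). Not the authors' exact formulation.
from math import cos, pi

def harmonic_amplitude(h, angles, vdc):
    """Peak amplitude of the h-th odd harmonic of the staircase waveform."""
    return (4.0 / (h * pi)) * sum(v * cos(h * a) for v, a in zip(vdc, angles))

def fitness(angles, vdc, v1_target=120.0 * 2 ** 0.5, harmonics=(5, 7)):
    """Penalize fundamental error and residual low-order harmonics.

    v1_target is taken here as the peak of a 120 V RMS fundamental (assumption);
    the 100x weighting of the fundamental error is illustrative.
    """
    err_fund = abs(harmonic_amplitude(1, angles, vdc) - v1_target)
    err_harm = sum(abs(harmonic_amplitude(h, angles, vdc)) for h in harmonics)
    return 100.0 * err_fund + err_harm  # a GA would evolve angles to minimize this

# Example: three cells with slightly unequal DC-link voltages (hypothetical values).
vdc = [64.0, 60.0, 58.0]
angles = [0.18, 0.55, 1.05]  # radians, one switching angle per cell
print(fitness(angles, vdc))
```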

  17. Productivity limits and potentials of the principles of conservation agriculture.

    PubMed

    Pittelkow, Cameron M; Liang, Xinqiang; Linquist, Bruce A; van Groenigen, Kees Jan; Lee, Juhwan; Lundy, Mark E; van Gestel, Natasja; Six, Johan; Venterea, Rodney T; van Kessel, Chris

    2015-01-15

    One of the primary challenges of our time is to feed a growing and more demanding world population with reduced external inputs and minimal environmental impacts, all under more variable and extreme climate conditions in the future. Conservation agriculture represents a set of three crop management principles that has received strong international support to help address this challenge, with recent conservation agriculture efforts focusing on smallholder farming systems in sub-Saharan Africa and South Asia. However, conservation agriculture is highly debated, with respect to both its effects on crop yields and its applicability in different farming contexts. Here we conduct a global meta-analysis using 5,463 paired yield observations from 610 studies to compare no-till, the original and central concept of conservation agriculture, with conventional tillage practices across 48 crops and 63 countries. Overall, our results show that no-till reduces yields, yet this response is variable and under certain conditions no-till can produce equivalent or greater yields than conventional tillage. Importantly, when no-till is combined with the other two conservation agriculture principles of residue retention and crop rotation, its negative impacts are minimized. Moreover, no-till in combination with the other two principles significantly increases rainfed crop productivity in dry climates, suggesting that it may become an important climate-change adaptation strategy for ever-drier regions of the world. However, any expansion of conservation agriculture should be done with caution in these areas, as implementation of the other two principles is often challenging in resource-poor and vulnerable smallholder farming systems, thereby increasing the likelihood of yield losses rather than gains. Although farming systems are multifunctional, and environmental and socio-economic factors need to be considered, our analysis indicates that the potential contribution of no-till to the

  18. Deciphering principles of transcription regulation in eukaryotic genomes

    PubMed Central

    Nguyen, Dat H; D'haeseleer, Patrik

    2006-01-01

    Transcription regulation has been responsible for organismal complexity and diversity in the course of biological evolution and adaptation, and it is determined largely by the context-dependent behavior of cis-regulatory elements (CREs). Therefore, understanding principles underlying CRE behavior in regulating transcription constitutes a fundamental objective of quantitative biology, yet these remain poorly understood. Here we present a deterministic mathematical strategy, the motif expression decomposition (MED) method, for deriving principles of transcription regulation at the single-gene resolution level. MED operates on all genes in a genome without requiring any a priori knowledge of gene cluster membership, or manual tuning of parameters. Applying MED to Saccharomyces cerevisiae transcriptional networks, we identified four functions describing four different ways that CREs can quantitatively affect gene expression levels. These functions, three of which have extrema in different positions in the gene promoter (short-, mid-, and long-range) whereas the other depends on the motif orientation, are validated by expression data. We illustrate how nature could use these principles as an additional dimension to amplify the combinatorial power of a small set of CREs in regulating transcription. PMID:16738557

  19. An entropy method for induced drag minimization

    NASA Technical Reports Server (NTRS)

    Greene, George C.

    1989-01-01

    A fundamentally new approach to the aircraft minimum induced drag problem is presented. The method, a 'viscous lifting line', is based on the minimum entropy production principle and does not require the planar wake assumption. An approximate, closed form solution is obtained for several wing configurations including a comparison of wing extension, winglets, and in-plane wing sweep, with and without a constraint on wing-root bending moment. Like the classical lifting-line theory, this theory predicts that induced drag is proportional to the square of the lift coefficient and inversely proportional to the wing aspect ratio. Unlike the classical theory, it predicts that induced drag is Reynolds number dependent and that the optimum spanwise circulation distribution is non-elliptic.
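
    The classical lifting-line scaling referred to here is, in the usual notation with span-efficiency factor e and aspect ratio AR, the relation below; the entropy-based 'viscous lifting line' adds a Reynolds-number dependence on top of it.

```latex
% Classical lifting-line scaling of the induced-drag coefficient:
% proportional to C_L^2 and inversely proportional to the aspect ratio AR,
% with span-efficiency factor e (e = 1 for the elliptic distribution).
C_{D,i} \;=\; \frac{C_L^{2}}{\pi\, e\, AR}
```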

  20. Quantum scattering in one-dimensional systems satisfying the minimal length uncertainty relation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernardo, Reginald Christian S., E-mail: rcbernardo@nip.upd.edu.ph; Esguerra, Jose Perico H., E-mail: jesguerra@nip.upd.edu.ph

    In quantum gravity theories, when the scattering energy is comparable to the Planck energy the Heisenberg uncertainty principle breaks down and is replaced by the minimal length uncertainty relation. In this paper, the consequences of the minimal length uncertainty relation on one-dimensional quantum scattering are studied using an approach involving a recently proposed second-order differential equation. An exact analytical expression for the tunneling probability through a locally-periodic rectangular potential barrier system is obtained. Results show that the existence of a non-zero minimal length uncertainty tends to shift the resonant tunneling energies to the positive direction. Scattering through a locally-periodic potential composed of double-rectangular potential barriers shows that the first band of resonant tunneling energies widens for minimal length cases when the double-rectangular potential barrier is symmetric but narrows down when the double-rectangular potential barrier is asymmetric. A numerical solution which exploits the use of Wronskians is used to calculate the transmission probabilities through the Pöschl–Teller well, Gaussian barrier, and double-Gaussian barrier. Results show that the probability of passage through the Pöschl–Teller well and Gaussian barrier is smaller in the minimal length cases compared to the non-minimal length case. For the double-Gaussian barrier, the probability of passage for energies that are more positive than the resonant tunneling energy is larger in the minimal length cases compared to the non-minimal length case. The approach is exact and applicable to many types of scattering potential.
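
    For reference, the sketch below computes transmission through a locally periodic rectangular barrier system with an ordinary (non-minimal-length) transfer-matrix calculation, i.e. the β → 0 limit against which results of this kind are compared; the minimal-length modification of the wave equation itself is not reproduced here. Units, barrier parameters, and sampled energies are illustrative.

```python
# Transfer-matrix transmission through a locally periodic rectangular barrier
# system in ordinary quantum mechanics (beta -> 0 reference case).
# Units: hbar = 2m = 1, so the local wavenumber is k = sqrt(E - V).
import numpy as np

def transmission(E, regions):
    """regions: list of (V, width) for the piecewise-constant potential between
    two semi-infinite zero-potential leads. Returns the transmission |t|^2."""
    ks = [np.sqrt(complex(E - V)) for V, _ in [(0.0, None)] + regions + [(0.0, None)]]
    M = np.eye(2, dtype=complex)
    for j, (V, w) in enumerate(regions):
        k_prev, k = ks[j], ks[j + 1]
        r = k_prev / k
        interface = 0.5 * np.array([[1 + r, 1 - r], [1 - r, 1 + r]])  # match psi, psi'
        propagate = np.diag([np.exp(1j * k * w), np.exp(-1j * k * w)])  # cross region j
        M = propagate @ interface @ M
    r = ks[-2] / ks[-1]                      # final interface into the right lead
    M = 0.5 * np.array([[1 + r, 1 - r], [1 - r, 1 + r]]) @ M
    t = M[0, 0] - M[0, 1] * M[1, 0] / M[1, 1]  # no incoming wave from the right
    return abs(t) ** 2

# Five identical barriers (height 10, width 1) separated by wells of width 1.
cell = [(10.0, 1.0), (0.0, 1.0)]
barriers = cell * 4 + [(10.0, 1.0)]
for E in (2.0, 6.0, 9.5):
    print(E, transmission(E, barriers))
```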

  1. Guiding principles for student leadership development in the doctor of pharmacy program to assist administrators and faculty members in implementing or refining curricula.

    PubMed

    Traynor, Andrew P; Boyle, Cynthia J; Janke, Kristin K

    2013-12-16

    To assist administrators and faculty members in colleges and schools of pharmacy by gathering expert opinion to frame, direct, and support investments in student leadership development. Twenty-six leadership instructors participated in a 3-round, online, modified Delphi process to define doctor of pharmacy (PharmD) student leadership instruction. Round 1 asked open-ended questions about leadership knowledge, skills, and attitudes to begin the generation of student leadership development guiding principles and competencies. Statements were identified as guiding principles when they were perceived as foundational to the instructional approach. Round 2 grouped responses for agreement rating and comment. Group consensus with a statement as a guiding principle was set prospectively at 80%. Round 3 allowed rating and comment on guidelines, modified from feedback in round 2, that did not meet consensus. The principles were verified by identifying common contemporary leadership development approaches in the literature. Twelve guiding principles, related to concepts of leadership and educational philosophy, were defined and could be linked to contemporary leadership development thought. These guiding principles describe the motivation for teaching leadership, the fundamental precepts of student leadership development, and the core tenets for leadership instruction. Expert opinion gathered using a Delphi process resulted in guiding principles that help to address many of the fundamental questions that arise when implementing or refining leadership curricula. The principles identified are supported by common contemporary leadership development thought.

  2. Dirac δ -function potential in quasiposition representation of a minimal-length scenario

    NASA Astrophysics Data System (ADS)

    Gusson, M. F.; Gonçalves, A. Oakes O.; Francisco, R. O.; Furtado, R. G.; Fabris, J. C.; Nogueira, J. A.

    2018-03-01

    A minimal-length scenario can be considered as an effective description of quantum gravity effects. In quantum mechanics the introduction of a minimal length can be accomplished through a generalization of Heisenberg's uncertainty principle. In this scenario, state eigenvectors of the position operator are no longer physical states and the representation in momentum space or a representation in a quasiposition space must be used. In this work, we solve the Schrödinger equation with a Dirac δ-function potential in quasiposition space. We calculate the bound state energy and the coefficients of reflection and transmission for the scattering states. We show that leading corrections are of order of the minimal length, O(√β), and the coefficients of reflection and transmission are no longer the same for the Dirac delta well and barrier as in ordinary quantum mechanics. Furthermore, assuming that the equivalence of the 1s state energy of the hydrogen atom and the bound state energy of the Dirac δ-function potential in the one-dimensional case is kept in a minimal-length scenario, we also find that the leading correction term for the ground state energy of the hydrogen atom is of the order of the minimal length and Δx_min ≤ 10⁻²⁵ m.
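
    The generalization of Heisenberg's uncertainty principle mentioned here is usually written, in its simplest one-dimensional form (conventions for the deformation parameter β vary between papers), as:

```latex
% Simplest one-dimensional minimal-length deformation of the Heisenberg algebra.
% Minimizing the right-hand side over Delta p (with <p> = 0) gives a nonzero
% minimum position uncertainty.
[\hat{x},\hat{p}] = i\hbar\left(1 + \beta \hat{p}^{2}\right)
\quad\Longrightarrow\quad
\Delta x\,\Delta p \ge \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^{2}\right)
\quad\Longrightarrow\quad
\Delta x_{\min} = \hbar\sqrt{\beta} .
```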

  3. Principles of Pesticide Use, Handling, and Application: Instructional Modules for Vocational Agriculture Education. Student Manual.

    ERIC Educational Resources Information Center

    Ellis Associates, Inc., College Park, MD.

    This training package is designed to present the basic principles of pesticide use, handling, and application. Included in this package is information on federal laws and regulations, personal safety, environmental implications, storage and disposal considerations, proper application procedures, and fundamentals of pest management. Successful…

  4. Principles of Pesticide Use, Handling, and Application: Instructional Modules for Vocational Agriculture Education. Teacher Manual.

    ERIC Educational Resources Information Center

    Ellis Associates, Inc., College Park, MD.

    The training package is designed to present the basic principles of pesticide use, handling, and application. Included in this package is information on Federal laws and regulations, personal safety, environmental implications, storage and disposal considerations, proper application procedures, and fundamentals of pest management. Successful…

  5. Recent Advances and Future Prospects in Fundamental Symmetries

    NASA Astrophysics Data System (ADS)

    Plaster, Brad

    2017-09-01

    A broad program of initiatives in fundamental symmetries seeks answers to several of the most pressing open questions in nuclear physics, ranging from the scale of the neutrino mass, to the particle-antiparticle nature of the neutrino, to the origin of the matter-antimatter asymmetry, to the limits of Standard Model interactions. Although the experimental program is quite broad, with efforts ranging from precision measurements of neutrino properties; to searches for electric dipole moments; to precision measurements of magnetic dipole moments; and to precision measurements of couplings, particle properties, and decays; all of these seemingly disparate initiatives are unified by several common threads. These include the use and exploitation of symmetry principles, novel cross-disciplinary experimental work at the forefront of the precision frontier, and the need for accompanying breakthroughs in development of the theory necessary for an interpretation of the anticipated results from these experiments. This talk will highlight recent accomplishments and advances in fundamental symmetries and point to the extraordinary level of ongoing activity aimed at realizing the development and interpretation of next-generation experiments. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Nuclear Physics, under Award Number DE-SC-0014622.

  6. Fundamentals of bipolar high-frequency surgery.

    PubMed

    Reidenbach, H D

    1993-04-01

    In endoscopic surgery a very precise surgical dissection technique and an efficient hemostasis are of decisive importance. The bipolar technique may be regarded as a method which satisfies both requirements, especially regarding a high safety standard in application. In this context the biophysical and technical fundamentals of this method, which have been known in principle for a long time, are described with regard to the special demands of a newly developed field of modern surgery. After classification of this method into a general and a quasi-bipolar mode, various technological solutions of specific bipolar probes, in a strict and in a generalized sense, are characterized in terms of indication. Experimental results obtained with different bipolar instruments and probes are given. The application of modern microprocessor-controlled high-frequency surgery equipment and, wherever necessary, the integration of additional ancillary technology into the specialized bipolar instruments may result in most useful and efficient tools of a key technology in endoscopic surgery.

  7. Cognitive radio adaptation for power consumption minimization using biogeography-based optimization

    NASA Astrophysics Data System (ADS)

    Qi, Pei-Han; Zheng, Shi-Lian; Yang, Xiao-Niu; Zhao, Zhi-Jin

    2016-12-01

    Adaptation is one of the key capabilities of cognitive radio, which focuses on how to adjust the radio parameters to optimize the system performance based on the knowledge of the radio environment and its capability and characteristics. In this paper, we consider the cognitive radio adaptation problem for power consumption minimization. The problem is formulated as a constrained power consumption minimization problem, and the biogeography-based optimization (BBO) is introduced to solve this optimization problem. A novel habitat suitability index (HSI) evaluation mechanism is proposed, in which both the power consumption minimization objective and the quality of services (QoS) constraints are taken into account. The results show that under different QoS requirement settings corresponding to different types of services, the algorithm can minimize power consumption while still maintaining the QoS requirements. Comparison with particle swarm optimization (PSO) and cat swarm optimization (CSO) reveals that BBO works better, especially at the early stage of the search, which means that the BBO is a better choice for real-time applications. Project supported by the National Natural Science Foundation of China (Grant No. 61501356), the Fundamental Research Funds of the Ministry of Education, China (Grant No. JB160101), and the Postdoctoral Fund of Shaanxi Province, China.
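
    A hedged sketch of the kind of habitat suitability index (HSI) evaluation described here is given below: the transmit-power objective plus QoS constraints folded in as penalty terms. The QoS metrics, thresholds, penalty weights, and the toy noise and BER models are hypothetical placeholders, not those of the paper.

```python
# Illustrative HSI (fitness) evaluation for cognitive-radio parameter adaptation:
# minimize transmit power subject to QoS constraints handled as penalty terms.
# All thresholds, weights, and channel models are hypothetical placeholders.
import math

def hsi(tx_power_mw, modulation_bits, bandwidth_hz, qos):
    """Lower is better: power objective plus penalties for violated QoS targets."""
    noise_mw = 1e-6 * bandwidth_hz                       # toy noise model
    snr = tx_power_mw / noise_mw
    throughput = bandwidth_hz * math.log2(1.0 + snr)     # Shannon-style estimate
    ber = 0.2 * math.exp(-1.5 * snr / (2 ** modulation_bits - 1))  # approximate M-QAM BER

    penalty = 0.0
    if throughput < qos["min_throughput_bps"]:
        penalty += 1e3 * (qos["min_throughput_bps"] - throughput) / qos["min_throughput_bps"]
    if ber > qos["max_ber"]:
        penalty += 1e3 * (ber - qos["max_ber"]) / qos["max_ber"]
    return tx_power_mw + penalty   # a BBO search would evolve candidates to minimize this

qos_voice = {"min_throughput_bps": 64e3, "max_ber": 1e-3}
print(hsi(tx_power_mw=5.0, modulation_bits=2, bandwidth_hz=200e3, qos=qos_voice))
```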

  8. Guiding principles of safety as a basis for developing a pharmaceutical safety culture.

    PubMed

    Edwards, Brian; Olsen, Axel K; Whalen, Matthew D; Gold, Marla J

    2007-05-01

    Despite the best efforts of industry and regulatory authorities, the trust of society in the process of medicine development and the communication of pharmaceutical risk has ebbed away. In response, the US government has called for a culture of compliance while the EU regulators talk of a 'culture of scientific excellence'. However, one of the fundamental problems hindering progress towards rebuilding trust based on a pharmaceutical safety culture is the lack of agreement and transparency between all stakeholders as to what is meant by 'Safety of Medicines'. For that reason, we propose that 'Guiding Principles of Safety for Pharmaceuticals' be developed, analogous to the way that Chemical Safety has been tackled. A logical starting point would be to examine the Principles outlined by the US Institute of Medicine, although we acknowledge that these Principles require further extensive debate and definition. Nevertheless, the Principles should take centre stage in the reform of pharmaceutical development required to restore society's trust.

  9. Tether fundamentals

    NASA Technical Reports Server (NTRS)

    Carroll, J. A.

    1986-01-01

    Some fundamental aspects of tethers are presented and briefly discussed. The effects of gravity gradients, dumbbell libration in circular orbits, tether control strategies and impact hazards for tethers are among those fundamentals. Also considered are aerodynamic drag, constraints in momentum transfer applications and constraints with permanently deployed tethers. The theoretical feasibility of these concepts is reviewed.

  10. Engaging Accounting Students: How to Teach Principles of Accounting in Creative and Exciting Ways

    ERIC Educational Resources Information Center

    Jaijairam, Paul

    2012-01-01

    Many students in secondary and post-secondary institutions generally have a difficult time grasping the concepts of accounting. This article contends that it is not the subject matter that is dry, but rather the methods in which faculty have traditionally presented accounting fundamentals and principles. Departing from standard lectures and…

  11. Corrected black hole thermodynamics in Damour-Ruffini’s method with generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Zhou, Shiwei; Chen, Ge-Rui

    Recently, some approaches to quantum gravity indicate that a minimal measurable length l_p ~ 10⁻³⁵ m should be considered; a direct implication of the minimal measurable length is the generalized uncertainty principle (GUP). Taking the effect of the GUP into account, Hawking radiation of massless scalar particles from a Schwarzschild black hole is investigated by the use of Damour-Ruffini's method. The original Klein-Gordon equation is modified. The corrected Hawking temperature is found to be related to the energy of the emitted particles. Some discussions appear in the last section.

  12. Prediction of Metabolic Flux Distribution from Gene Expression Data Based on the Flux Minimization Principle

    DTIC Science & Technology

    2014-11-14

    problem. Modification of the FBA algorithm to incorporate additional biological information from gene expression profiles is... We set the maximization of biomass production as the objective of FBA and implemented it in two different forms: without flux minimization (or
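
    As a self-contained illustration of the two-step idea behind flux minimization (maximize biomass first, then minimize total flux while holding biomass at its optimum), the sketch below runs a parsimonious-FBA-style calculation on a tiny invented network with scipy; the stoichiometry, bounds, and reaction names are hypothetical.

```python
# Parsimonious-FBA-style two-step linear program on a hypothetical toy network:
# step 1 maximizes biomass, step 2 minimizes total flux at that biomass optimum.
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix: rows = metabolites (A, B), columns = reactions
# R1: -> A,  R2: A -> B,  R3: B -> A (futile back-reaction),  R4: B -> biomass
S = np.array([[ 1, -1,  1,  0],
              [ 0,  1, -1, -1]], dtype=float)
bounds = [(0, 10), (0, 100), (0, 100), (0, 100)]   # uptake R1 capped at 10

# Step 1: maximize biomass flux v4 (linprog minimizes, hence the sign flip).
step1 = linprog(c=[0, 0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
v_biomass = step1.x[3]

# Step 2: minimize total flux while holding biomass at its optimum
# (all reactions are irreversible here, so the total flux is a linear objective).
bounds_fixed = bounds[:3] + [(v_biomass, v_biomass)]
step2 = linprog(c=[1, 1, 1, 1], A_eq=S, b_eq=np.zeros(2), bounds=bounds_fixed)

print("max biomass:", round(v_biomass, 3))
print("fluxes with minimization:", np.round(step2.x, 3))  # futile R2/R3 cycle removed
```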

  13. Legal principles of confidentiality and other public interests: Part 1.

    PubMed

    Fullbrook, Suzanne

    The principles of confidentiality are of paramount importance to nurses and all health professionals. This is explicitly so as the Common Law recognizes confidentiality and supports its importance. However, current practice must take cognizance of the realities of 21st century healthcare delivery - we live in an age of electronic data that is potentially very difficult to keep secret. New rules, protocols and guidelines are being formulated, and regulatory bodies such as the Nursing and Midwifery Council (NMC) reflect such rules in their codes of professional conduct. There is, however, a debate suggesting that the rules that relate to confidentiality may need to expand or even bend a little as innovative ways of obtaining, storing, utilizing and communicating data continue to occupy the minds of government and those who formulate legal principles (British Medical Association, 2005). This series of three articles will explore these issues. The first part is a review of case law that explores the fundamental legal principles that underpin confidentiality. The second will concentrate on a review of the guidelines that are to be found in professional regulatory documentation - the NMC and the General Medical Council - as they relate to the legal principles. The third and last part will review and reflect on issues that relate expressly to the implementation of electronic patient records, with a review of appropriate statutory legislation and principles of common law.

  14. Design principles and developmental mechanisms underlying retinal mosaics.

    PubMed

    Reese, Benjamin E; Keeley, Patrick W

    2015-08-01

    Most structures within the central nervous system (CNS) are composed of different types of neuron that vary in both number and morphology, but relatively little is known about the interplay between these two features, i.e. about the population dynamics of a given cell type. How such arrays of neurons are distributed within a structure, and how they differentiate their dendrites relative to each other, are issues that have recently drawn attention in the invertebrate nervous system, where the genetic and molecular underpinnings of these organizing principles are being revealed in exquisite detail. The retina is one of the few locations where these principles have been extensively studied in the vertebrate CNS, indeed, where the design principles of 'mosaic regularity' and 'uniformity of coverage' were first explicitly defined, quantified, and related to each other. Recent studies have revealed a number of genes that influence the formation of these histotypical features in the retina, including homologues of those invertebrate genes, although close inspection reveals that they do not always mediate comparable developmental processes nor elucidate fundamental design principles. The present review considers just how pervasive these features of 'mosaic regularity' and 'uniform dendritic coverage' are within the mammalian retina, discussing the means by which such features can be assessed in the mature and developing nervous system and examining the limitations associated with those assessments. We then address the extent to which these two design principles co-exist within different populations of neurons, and how they are achieved during development. Finally, we consider the neural phenotypes obtained in mutant nervous systems, to address whether a prospective gene of interest underlies those very design principles. © 2014 The Authors. Biological Reviews © 2014 Cambridge Philosophical Society.

  15. Quantum theory of the generalised uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bruneton, Jean-Philippe; Larena, Julien

    2017-04-01

    We significantly extend previous work on the Hilbert space representations of the generalized uncertainty principle (GUP) in 3 + 1 dimensions of the form [X_i, P_j] = i F_{ij}, where F_{ij} = f(P^2) δ_{ij} + g(P^2) P_i P_j for arbitrary functions f and g; however, we restrict our study to the case of commuting X's. We focus in particular on the symmetries of the theory, and the minimal length that emerges in some cases. We first show that, at the algebraic level, there exists an unambiguous mapping between the GUP with a deformed quantum algebra and a quadratic Hamiltonian, and a standard Heisenberg algebra of operators with an aquadratic Hamiltonian, provided the boost sector of the symmetries is modified accordingly. The theory can also be mapped to a completely standard quantum mechanics with standard symmetries, but with momentum-dependent position operators. Next, we investigate the Hilbert space representations of these algebraically equivalent models, and focus specifically on whether they exhibit a minimal length. We carry out the functional analysis of the various operators involved and show that the appearance of a minimal length critically depends on the relationship between the generators of translations and the physical momenta. In particular, because this relationship is preserved by the algebraic mapping presented in this paper, when a minimal length is present in the standard GUP it is also present in the corresponding aquadratic Hamiltonian formulation, despite the perfectly standard algebra of this model. In general, a minimal length requires bounded generators of translations, i.e. a specific kind of quantization of space, and this depends on the precise shape of the function f defined previously. This result provides an elegant and unambiguous classification of which universal quantum gravity corrections lead to the emergence of a minimal length.
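
    For orientation, the one-dimensional special case f(P^2) = 1 + βP^2, g = 0 reduces to the well-known Kempf-Mangano-Mann form of the GUP; the chain below is a standard textbook illustration of how a minimal length emerges, not a result specific to this paper.

        \[
          [X, P] = i\hbar\,(1 + \beta P^{2})
          \;\;\Longrightarrow\;\;
          \Delta X\,\Delta P \ge \frac{\hbar}{2}\bigl(1 + \beta(\Delta P)^{2} + \beta\langle P\rangle^{2}\bigr)
          \;\;\Longrightarrow\;\;
          (\Delta X)_{\min} = \hbar\sqrt{\beta}\quad(\text{at }\langle P\rangle = 0).
        \]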

  16. Green tribology: principles, research areas and challenges.

    PubMed

    Nosonovsky, Michael; Bhushan, Bharat

    2010-10-28

    In this introductory paper for the Theme Issue on green tribology, we discuss the concept of green tribology and its relation to other areas of tribology as well as other 'green' disciplines, namely, green engineering and green chemistry. We formulate the 12 principles of green tribology: the minimization of (i) friction and (ii) wear, (iii) the reduction or complete elimination of lubrication, including self-lubrication, (iv) natural and (v) biodegradable lubrication, (vi) using sustainable chemistry and engineering principles, (vii) biomimetic approaches, (viii) surface texturing, (ix) environmental implications of coatings, (x) real-time monitoring, (xi) design for degradation, and (xii) sustainable energy applications. We further define three areas of green tribology: (i) biomimetics for tribological applications, (ii) environment-friendly lubrication, and (iii) the tribology of renewable-energy application. The integration of these areas remains a primary challenge for this novel area of research. We also discuss the challenges of green tribology and future directions of research.

  17. Storage of RF photons in minimal conditions

    NASA Astrophysics Data System (ADS)

    Cromières, J.-P.; Chanelière, T.

    2018-02-01

    We investigate the minimal conditions needed to coherently store an RF pulse in a material medium. We choose a commercial quartz crystal as the memory support because it is a widely available component with a high Q-factor. Pulse storage is obtained by dynamically varying the light-matter coupling with an analog switch. This parametric driving of the quartz dynamics can alternatively be interpreted as a stopped-light experiment. By optimizing the temporal shape of the pulse, we obtain an efficiency of 26%, a storage time of 209 μs and a time-to-bandwidth product of 98. The coherent character of the storage is demonstrated. Our goal is to connect different types of memories in the RF and optical domains for quantum information processing; our motivation is essentially fundamental.

  18. Simulation of minimally invasive vascular interventions for training purposes.

    PubMed

    Alderliesten, Tanja; Konings, Maurits K; Niessen, Wiro J

    2004-01-01

    To master the skills required to perform minimally invasive vascular interventions, proper training is essential. A computer simulation environment has been developed to provide such training. The simulation is based on an algorithm specifically developed to simulate the motion of a guide wire--the main instrument used during these interventions--in the human vasculature. In this paper, the design and model of the computer simulation environment are described and first results obtained with phantom and patient data are presented. To simulate minimally invasive vascular interventions, a discrete representation of a guide wire is used which allows modeling of guide wires with different physical properties. An algorithm for simulating the propagation of a guide wire within a vascular system, on the basis of the principle of minimization of energy, has been developed. Both longitudinal translation and rotation are incorporated as possibilities for manipulating the guide wire. The simulation is based on quasi-static mechanics. Two types of energy are introduced: internal energy related to the bending of the guide wire, and external energy resulting from the elastic deformation of the vessel wall. A series of experiments were performed on phantom and patient data. Simulation results are qualitatively compared with 3D rotational angiography data. The results indicate plausible behavior of the simulation.
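
    As an illustration of the energy-minimization idea described in this record (and not the authors' actual algorithm), the sketch below represents a guide wire as a discrete 2D chain of nodes, assigns an internal bending energy to the angles between successive segments and an external wall energy where the wire leaves an assumed tubular vessel, and relaxes the free nodes with a generic optimizer. All names, the parabolic vessel centerline and every parameter value are illustrative assumptions.

    import numpy as np
    from scipy.optimize import minimize

    N_NODES = 12          # discrete guide-wire nodes
    SEG_LEN = 1.0         # nominal segment length
    K_BEND = 5.0          # bending stiffness (internal energy)
    K_WALL = 50.0         # vessel-wall elasticity (external energy)
    K_STRETCH = 100.0     # soft constraint keeping segments near SEG_LEN
    VESSEL_RADIUS = 1.5   # assumed lumen radius around the centerline

    def centerline(x):
        """Assumed vessel centerline: a gentle parabolic bend."""
        return 0.1 * x ** 2

    def total_energy(free_xy, fixed):
        """Internal bending energy + wall deformation energy + stretch penalty."""
        pts = np.vstack([fixed, free_xy.reshape(-1, 2)])
        segs = np.diff(pts, axis=0)
        lengths = np.linalg.norm(segs, axis=1)
        unit = segs / lengths[:, None]
        # bending: squared angle between consecutive segments
        cosang = np.clip(np.sum(unit[:-1] * unit[1:], axis=1), -1.0, 1.0)
        e_bend = K_BEND * np.sum(np.arccos(cosang) ** 2)
        # wall: quadratic penalty where the wire pokes outside the lumen
        overlap = np.maximum(np.abs(pts[:, 1] - centerline(pts[:, 0])) - VESSEL_RADIUS, 0.0)
        e_wall = K_WALL * np.sum(overlap ** 2)
        # near-inextensibility of the wire
        e_stretch = K_STRETCH * np.sum((lengths - SEG_LEN) ** 2)
        return e_bend + e_wall + e_stretch

    # the first two nodes are fixed: insertion point and insertion direction
    fixed = np.array([[0.0, 0.0], [SEG_LEN, 0.0]])
    # initial guess: wire continues straight along the x-axis
    free0 = np.column_stack([SEG_LEN * np.arange(2, N_NODES),
                             np.zeros(N_NODES - 2)]).ravel()

    res = minimize(total_energy, free0, args=(fixed,), method="L-BFGS-B")
    shape = np.vstack([fixed, res.x.reshape(-1, 2)])
    print("final energy:", round(res.fun, 3))
    print(np.round(shape, 2))

    The quasi-static propagation described in the abstract would repeatedly advance the wire tip and re-minimize; this sketch shows only a single relaxation of a fixed-length wire.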

  19. The 2009 Earth Science Literacy Principles

    NASA Astrophysics Data System (ADS)

    Wysession, M. E.; Budd, D. A.; Campbell, K. M.; Conklin, M. H.; Kappel, E. S.; Ladue, N.; Lewis, G.; Raynolds, R.; Ridky, R. W.; Ross, R. M.; Taber, J.; Tewksbury, B. J.; Tuddenham, P.

    2009-12-01

    In 2009, the NSF-funded Earth Science Literacy Initiative (ESLI) completed and published a document representing a community consensus about what all Americans should understand about Earth sciences. These Earth Science Literacy Principles, presented as a printed brochure and on the Internet at www.earthscienceliteracy.org, were created through the work of nearly 1000 geoscientists and geoeducators who helped identify nine “big ideas” and seventy-five “supporting concepts” fundamental to terrestrial geosciences. The content scope involved the geosphere and land-based hydrosphere as addressed by the NSF-EAR program, including the fields of geobiology and low-temperature geochemistry, geomorphology and land-use dynamics, geophysics, hydrologic sciences, petrology and geochemistry, sedimentary geology and paleobiology, and tectonics. The ESLI Principles were designed to complement similar documents from the ocean, atmosphere, and climate research communities, with the long-term goal of combining these separate literacy documents into a single Earth System Science literacy framework. The aim of these principles is to educate the public, shape the future of geoscience education, and help guide the development of government policy related to Earth science. For example, K-12 textbooks are currently being written and museum exhibits constructed with these Principles in hand. NPR-funded educational videos are in the process of being made in alignment with the ESLP Principles. US House and Senate representatives on science and education committees have been made aware that the major geoscience organizations have endorsed such a document generated and supported by the community. Given the importance of Earth science in so many societally relevant topics such as climate change, energy and mineral resources, water availability, natural hazards, agriculture, and human impacts on the biosphere, efforts should be taken to ensure that this document is in a position to

  20. [Bacterial vaginosis and trichomoniasis. Fundamental world guidelines on management and therapy of the patients].

    PubMed

    Gomberg, M A

    2013-01-01

    Vaginal discharge is one of the most frequent symptoms requiring medical advice. Vaginal discharges are mainly associated with three diseases: bacterial vaginosis, trichomoniasis and candidiasis. The review is concerned with up-to-date approaches to the treatment of females with bacterial vaginosis and trichomoniasis, diseases that differ in etiology and pathogenesis but are similar with respect to treatment. The analysis complies with the principles of the two fundamental world guidelines.

  1. On entropic uncertainty relations in the presence of a minimal length

    NASA Astrophysics Data System (ADS)

    Rastegin, Alexey E.

    2017-07-01

    Entropic uncertainty relations for the position and momentum within the generalized uncertainty principle are examined. Studies of this principle are motivated by the existence of a minimal observable length. Then the position and momentum operators satisfy the modified commutation relation, for which more than one algebraic representation is known. One of them is described by auxiliary momentum so that the momentum and coordinate wave functions are connected by the Fourier transform. However, the probability density functions of the physically true and auxiliary momenta are different. As the corresponding entropies differ, known entropic uncertainty relations are changed. Using differential Shannon entropies, we give a state-dependent formulation with correction term. State-independent uncertainty relations are obtained in terms of the Rényi entropies and the Tsallis entropies with binning. Such relations allow one to take into account a finiteness of measurement resolution.
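
    For reference, the undeformed benchmark that such studies modify is the Białynicki-Birula-Mycielski relation for the differential Shannon entropies of position and momentum; it is quoted below in standard notation purely as background.

        \[
          h(X) + h(P) \;\ge\; \ln(\pi e \hbar),
          \qquad
          h(X) = -\int \rho(x)\,\ln \rho(x)\,\mathrm{d}x ,
        \]

    with h(P) defined analogously from the momentum probability density; the abstract above describes how this relation acquires a correction term once the physically true and auxiliary momenta are distinguished.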

  2. Ergonomics and design: its principles applied in the industry.

    PubMed

    Tavares, Ademario Santos; Silva, Francisco Nilson da

    2012-01-01

    Industrial Design encompasses both product development and optimization of production process. In this sense, Ergonomics plays a fundamental role, because its principles, methods and techniques can help operators to carry out their tasks most successfully. A case study carried out in an industry shows that the interaction among Design, Production Engineering and Materials Engineering departments may improve some aspects concerned security, comfort, efficiency and performance. In this process, Ergonomics had shown to be of essential importance to strategic decision making to the improvement of production section.

  3. Plain fundamentals of Fundamental Planes: analytics and algorithms

    NASA Astrophysics Data System (ADS)

    Sheth, Ravi K.; Bernardi, Mariangela

    2012-05-01

    Estimates of the coefficients a and b of the Fundamental Plane relation R ∝ σ^a I^b depend on whether one minimizes the scatter in the R direction, or orthogonal to the plane. We provide explicit expressions for a and b (and confidence limits) in terms of the covariances between log R, log σ and log I. Our expressions quantify the origin of the difference between the direct, inverse and orthogonal fit coefficients. They also show how to account for correlated errors, how to quantify the difference between the plane in a magnitude-limited survey and one which is volume limited, how to determine whether a scaling relation will be biased when using an apparent magnitude-limited survey, how to remove this bias and why some forms of the z ≈ 0 plane appear to be less affected by selection effects, but that this does not imply that they will remain unaffected at high redshift. Finally, they show why, to a good approximation, the three vectors associated with the plane, one orthogonal to and the other two in it, can all be written as simple combinations of a and b. Essentially, this is a consequence of the fact that the distribution of surface brightness is much broader than that of velocity dispersions, and velocity dispersion and surface brightness are only weakly correlated. Why this should be so for galaxies is a fundamental open question about the physics of early-type galaxy formation. We argue that if luminosity evolution is differential, and sizes and velocity dispersions do not evolve, then this is just an accident: velocity dispersion and surface brightness must have been correlated in the past. On the other hand, if the (lack of) correlation is similar to that at the present time, then differential luminosity evolution must have been accompanied by structural evolution. A model in which the luminosities of low-luminosity galaxies evolve more rapidly than do those of higher luminosity galaxies is able to produce the observed decrease in a (by a factor of 2 at z
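
    The direct-versus-orthogonal distinction that the abstract quantifies can be reproduced numerically from sample covariances. The sketch below is a generic construction on mock data (the drawn scatter, correlation and 'true' coefficients are arbitrary assumptions), not the authors' explicit expressions.

    import numpy as np

    rng = np.random.default_rng(0)

    # mock "galaxies": log(sigma) and log(I) weakly correlated, log(R) built
    # on an assumed plane R ~ sigma^a * I^b plus intrinsic scatter
    n = 5000
    log_sigma = rng.normal(2.3, 0.10, n)
    log_I = rng.normal(2.0, 0.25, n) + 0.2 * (log_sigma - 2.3)
    a_true, b_true = 1.4, -0.75                      # illustrative only
    log_R = a_true * log_sigma + b_true * log_I + rng.normal(0.0, 0.05, n)

    X = np.column_stack([log_R, log_sigma, log_I])
    C = np.cov(X, rowvar=False)                      # 3x3 covariance matrix

    # direct fit: minimise scatter in the log R direction (normal equations)
    a_dir, b_dir = np.linalg.solve(C[1:, 1:], C[1:, 0])

    # orthogonal fit: plane normal = eigenvector of C with smallest eigenvalue
    evals, evecs = np.linalg.eigh(C)                 # eigenvalues ascending
    n_vec = evecs[:, 0]
    a_orth, b_orth = -n_vec[1] / n_vec[0], -n_vec[2] / n_vec[0]

    print(f"direct fit:     a = {a_dir:.3f}, b = {b_dir:.3f}")
    print(f"orthogonal fit: a = {a_orth:.3f}, b = {b_orth:.3f}")

    The gap between the two sets of coefficients grows with the intrinsic scatter, which is the direct-versus-orthogonal difference the paper's expressions make explicit.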

  4. Guiding Principles for Student Leadership Development in the Doctor of Pharmacy Program to Assist Administrators and Faculty Members in Implementing or Refining Curricula

    PubMed Central

    Boyle, Cynthia J.; Janke, Kristin K.

    2013-01-01

    Objective. To assist administrators and faculty members in colleges and schools of pharmacy by gathering expert opinion to frame, direct, and support investments in student leadership development. Methods. Twenty-six leadership instructors participated in a 3-round, online, modified Delphi process to define doctor of pharmacy (PharmD) student leadership instruction. Round 1 asked open-ended questions about leadership knowledge, skills, and attitudes to begin the generation of student leadership development guiding principles and competencies. Statements were identified as guiding principles when they were perceived as foundational to the instructional approach. Round 2 grouped responses for agreement rating and comment. Group consensus with a statement as a guiding principle was set prospectively at 80%. Round 3 allowed rating and comment on guidelines, modified from feedback in round 2, that did not meet consensus. The principles were verified by identifying common contemporary leadership development approaches in the literature. Results. Twelve guiding principles, related to concepts of leadership and educational philosophy, were defined and could be linked to contemporary leadership development thought. These guiding principles describe the motivation for teaching leadership, the fundamental precepts of student leadership development, and the core tenets for leadership instruction. Conclusions. Expert opinion gathered using a Delphi process resulted in guiding principles that help to address many of the fundamental questions that arise when implementing or refining leadership curricula. The principles identified are supported by common contemporary leadership development thought. PMID:24371345

  5. Some Fundamental Molecular Mechanisms of Contractility in Fibrous Macromolecules

    PubMed Central

    Mandelkern, L.

    1967-01-01

    The fundamental molecular mechanisms of contractility and tension development in fibrous macromolecules are developed from the point of view of the principles of polymer physical chemistry. The problem is treated in a general manner to encompass the behavior of all macromolecular systems irrespective of their detailed chemical structure and particular function, if any. Primary attention is given to the contractile process which accompanies the crystal-liquid transition in axially oriented macromolecular systems. The theoretical nature of the process is discussed, and many experimental examples are given from the literature which demonstrate the expected behavior. Experimental attention is focused on the contraction of fibrous proteins, and the same underlying molecular mechanism is shown to be operative for a variety of different systems. PMID:6050598

  6. Fundamentals of Cryogenics

    NASA Technical Reports Server (NTRS)

    Johnson, Wesley; Tomsik, Thomas; Moder, Jeff

    2014-01-01

    Analysis of the extreme conditions encountered in cryogenic systems demands considerable effort from analysts and engineers. Due to the costs and complexity associated with the extremely cold temperatures involved, testing is sometimes minimized and extra analysis is often relied upon. This short course is designed as an introduction to cryogenic engineering and analysis, and it is intended to introduce the basic concepts related to cryogenic analysis and testing as well as help the analyst understand the impacts of various requests on a test facility. Discussion will revolve around operational functions often found in cryogenic systems, hardware for both tests and facilities, and what design or modelling tools are available for performing the analysis. Emphasis will be placed on which scenarios call for which hardware or analysis tools to obtain the desired results. The class will provide a review of first principles, engineering practices, and those relations directly applicable to this subject including such topics as cryogenic fluids, thermodynamics and heat transfer, material properties at low temperature, insulation, cryogenic equipment, instrumentation, refrigeration, testing of cryogenic systems, cryogenics safety and typical thermal and fluid analysis used by the engineer. The class will provide references for further learning on various topics in cryogenics for those who want to dive deeper into the subject or have encountered specific problems.

  7. Equivalence principle in chameleon models

    NASA Astrophysics Data System (ADS)

    Kraiselburd, Lucila; Landau, Susana J.; Salgado, Marcelo; Sudarsky, Daniel; Vucetich, Héctor

    2018-05-01

    Most theories that predict time and/or space variation of fundamental constants also predict violations of the weak equivalence principle (WEP). In 2004 Khoury and Weltman [1] proposed the so-called chameleon field, arguing that it could help avoid experimental bounds on the WEP while having a nontrivial cosmological impact. In this paper we revisit the extent to which these expectations continue to hold as we enter the regime of high-precision tests. The basis of the study is the development of a new method for computing the force between two massive bodies induced by the chameleon field which takes into account the influence on the field by both the large and the test bodies. We confirm that in the thin-shell regime the force does depend nontrivially on the test body's composition, even when the chameleon coupling constants βi=β are universal. We also propose a simple criterion based on energy minimization, which we use to determine which of the approximations used in computing the scalar field in a two-body problem is better in each specific regime. As an application of our analysis we then compare the resulting differential acceleration of two test bodies with the corresponding bounds obtained from Eötvös-type experiments. We consider two setups: (1) an Earth-based experiment where the test bodies are made of Be and Al; (2) the Lunar Laser Ranging experiment. We find that for some choices of the free parameters of the chameleon model the predictions of the Eötvös parameter are larger than some of the previous estimates. As a consequence, we put new constraints on these free parameters. Our conclusions strongly suggest that the properties of immunity from experimental tests of the WEP, usually attributed to the chameleon and related models, should be carefully reconsidered. An important result of our analysis is that our approach leads to new constraints on the parameter space of the chameleon models.

  8. Understanding and applying principles of social cognition and ...

    EPA Pesticide Factsheets

    Environmental governance systems are under greater pressure to adapt and to cope with increased social and ecological uncertainty from stressors like climate change. We review principles of social cognition and decision making that shape and constrain how environmental governance systems adapt. We focus primarily on the interplay between key decision makers in society and legal systems. We argue that adaptive governance must overcome three cooperative dilemmas to facilitate adaptation: (1) encouraging collaborative problem solving, (2) garnering social acceptance and commitment, and (3) cultivating a culture of trust and tolerance for change and uncertainty. However, to do so governance systems must cope with biases in people’s decision making that cloud their judgment and create conflict. These systems must also satisfy people’s fundamental needs for self-determination, fairness, and security, ensuring that changes to environmental governance are perceived as legitimate, trustworthy, and acceptable. We discuss the implications of these principles for common governance solutions (e.g., public participation, enforcement) and conclude with methodological recommendations. We outline how scholars can investigate the social cognitive principles involved in cases of adaptive governance. Social-ecological stressors place significant pressure on major societal systems, triggering adaptive reforms in human governance and environmental law. Though potentially benefici

  9. Principles of thermoacoustic energy harvesting

    NASA Astrophysics Data System (ADS)

    Avent, A. W.; Bowen, C. R.

    2015-11-01

    Thermoacoustics exploits a temperature gradient to produce powerful acoustic pressure waves. The technology has a key role to play in energy harvesting systems. A time-line in the development of thermoacoustics is presented from its earliest recorded example in glass blowing through to the development of the Sondhauss and Rijke tubes to Stirling engines and pulse-tube cryo-cooling. The review sets the current literature in context, identifies key publications and promising areas of research. The fundamental principles of thermoacoustic phenomena are explained; design challenges and factors influencing efficiency are explored. Thermoacoustic processes involve complex multi-physical coupling and transient, highly non-linear relationships which are computationally expensive to model; appropriate numerical modelling techniques and options for analyses are presented. Potential methods of harvesting the energy in the acoustic waves are also examined.

  10. Slope across the Curriculum: Principles and Standards for School Mathematics and Common Core State Standards

    ERIC Educational Resources Information Center

    Nagle, Courtney; Moore-Russo, Deborah

    2014-01-01

    This article provides an initial comparison of the Principles and Standards for School Mathematics and the Common Core State Standards for Mathematics by examining the fundamental notion of slope. Each set of standards is analyzed using eleven previously identified conceptualizations of slope. Both sets of standards emphasize Functional Property,…

  11. Against Explanatory Minimalism in Psychiatry.

    PubMed

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell's criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein's Zettel. But attention to the context of Wittgenstein's remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation.

  12. Principles, Techniques, and Applications of Tissue Microfluidics

    NASA Technical Reports Server (NTRS)

    Wade, Lawrence A.; Kartalov, Emil P.; Shibata, Darryl; Taylor, Clive

    2011-01-01

    The principle of tissue microfluidics and its resultant techniques has been applied to cell analysis. Building microfluidics to suit a particular tissue sample would allow the rapid, reliable, inexpensive, highly parallelized, selective extraction of chosen regions of tissue for purposes of further biochemical analysis. Furthermore, the applicability of the techniques ranges beyond the described pathology application. For example, they would also allow the posing and successful answering of new sets of questions in many areas of fundamental research. The proposed integration of microfluidic techniques and tissue slice samples is called tissue microfluidics because it molds the microfluidic architectures in accordance with each particular structure of each specific tissue sample. Thus, microfluidics can be built around the tissues, following the tissue structure, or alternatively, the microfluidics can be adapted to the specific geometry of particular tissues. By contrast, the traditional approach is that microfluidic devices are structured in accordance with engineering considerations, while the biological components in applied devices are forced to comply with these engineering presets. The proposed principles represent a paradigm shift in microfluidic technology in three important ways: Microfluidic devices are to be directly integrated with, onto, or around tissue samples, in contrast to the conventional method of off-chip sample extraction followed by sample insertion in microfluidic devices. Architectural and operational principles of microfluidic devices are to be subordinated to suit specific tissue structure and needs, in contrast to the conventional method of building devices according to fluidic function alone and without regard to tissue structure. Sample acquisition from tissue is to be performed on-chip and is to be integrated with the diagnostic measurement within the same device, in contrast to the conventional method of off-chip sample prep and

  13. System level electrochemical principles

    NASA Technical Reports Server (NTRS)

    Thaller, L. H.

    1985-01-01

    Traditional electrochemical storage concepts are difficult to translate into high-power, high-voltage multikilowatt storage systems. Battery technology has responded with the increased use of electronics and of electrochemical couples that minimize the difficulties associated with corrective measures for reducing cell-to-cell capacity dispersion. Actively cooled bipolar concepts are described which represent attractive alternative system concepts. They are projected to have higher energy densities and lower volumes than current concepts. They should be easier to scale from one capacity to another and have a closer cell-to-cell capacity balance. These newer storage system concepts are easier to manage since they are designed as a fully integrated battery. These ideas are referred to as system-level electrochemistry. The hydrogen-oxygen regenerative fuel cell (RFC) is probably the best example of the integrated use of these principles.

  14. Generating Pedestrian Trajectories Consistent with the Fundamental Diagram Based on Physiological and Psychological Factors

    PubMed Central

    Narang, Sahil; Best, Andrew; Curtis, Sean; Manocha, Dinesh

    2015-01-01

    Pedestrian crowds have often been modeled as many-particle systems, including with microscopic multi-agent simulators. One of the key challenges is to unearth governing principles that can model pedestrian movement and to use them to reproduce paths and behaviors that are frequently observed in human crowds. To that end, we present a novel crowd simulation algorithm that generates pedestrian trajectories exhibiting the speed-density relationships expressed by the Fundamental Diagram. Our approach is based on biomechanical principles and psychological factors. The overall formulation results in better utilization of free space by the pedestrians and can easily be combined with well-known multi-agent simulation techniques with little computational overhead. We are able to generate human-like dense crowd behaviors in large indoor and outdoor environments and validate the results with captured real-world crowd trajectories. PMID:25875932
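
    For readers unfamiliar with the Fundamental Diagram mentioned above, a widely used empirical speed-density relation (Weidmann's form) is sketched below with its customary parameter values; it is given only as background and is not the model proposed in this paper.

    import numpy as np

    def weidmann_speed(density, v_free=1.34, gamma=1.913, rho_max=5.4):
        """Weidmann speed-density relation; density in pedestrians/m^2, speed in m/s."""
        density = np.clip(density, 1e-6, rho_max)
        return v_free * (1.0 - np.exp(-gamma * (1.0 / density - 1.0 / rho_max)))

    for rho in (0.5, 1.0, 2.0, 3.5, 5.0):
        v = weidmann_speed(rho)
        print(f"{rho:3.1f} ped/m^2 -> speed {v:4.2f} m/s, flow {rho * v:4.2f} ped/(m s)")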

  15. Catalyst design for enhanced sustainability through fundamental surface chemistry.

    PubMed

    Personick, Michelle L; Montemore, Matthew M; Kaxiras, Efthimios; Madix, Robert J; Biener, Juergen; Friend, Cynthia M

    2016-02-28

    Decreasing energy consumption in the production of platform chemicals is necessary to improve the sustainability of the chemical industry, which is the largest consumer of delivered energy. The majority of industrial chemical transformations rely on catalysts, and therefore designing new materials that catalyse the production of important chemicals via more selective and energy-efficient processes is a promising pathway to reducing energy use by the chemical industry. Efficiently designing new catalysts benefits from an integrated approach involving fundamental experimental studies and theoretical modelling in addition to evaluation of materials under working catalytic conditions. In this review, we outline this approach in the context of a particular catalyst-nanoporous gold (npAu)-which is an unsupported, dilute AgAu alloy catalyst that is highly active for the selective oxidative transformation of alcohols. Fundamental surface science studies on Au single crystals and AgAu thin-film alloys in combination with theoretical modelling were used to identify the principles which define the reactivity of npAu and subsequently enabled prediction of new reactive pathways on this material. Specifically, weak van der Waals interactions are key to the selectivity of Au materials, including npAu. We also briefly describe other systems in which this integrated approach was applied. © 2016 The Author(s).

  16. Catalyst design for enhanced sustainability through fundamental surface chemistry

    DOE PAGES

    Personick, Michelle L.; Montemore, Matthew M.; Kaxiras, Efthimios; ...

    2016-01-11

    Decreasing energy consumption in the production of platform chemicals is necessary to improve the sustainability of the chemical industry, which is the largest consumer of delivered energy. The majority of industrial chemical transformations rely on catalysts, and therefore designing new materials that catalyse the production of important chemicals via more selective and energy-efficient processes is a promising pathway to reducing energy use by the chemical industry. Efficiently designing new catalysts benefits from an integrated approach involving fundamental experimental studies and theoretical modelling in addition to evaluation of materials under working catalytic conditions. In this paper, we outline this approach in the context of a particular catalyst—nanoporous gold (npAu)—which is an unsupported, dilute AgAu alloy catalyst that is highly active for the selective oxidative transformation of alcohols. Fundamental surface science studies on Au single crystals and AgAu thin-film alloys in combination with theoretical modelling were used to identify the principles which define the reactivity of npAu and subsequently enabled prediction of new reactive pathways on this material. Specifically, weak van der Waals interactions are key to the selectivity of Au materials, including npAu. Finally, we also briefly describe other systems in which this integrated approach was applied.

  17. Moral principles or consumer preferences? Alternative framings of the trolley problem.

    PubMed

    Rai, Tage S; Holyoak, Keith J

    2010-03-01

    We created paired moral dilemmas with minimal contrasts in wording, a research strategy that has been advocated as a way to empirically establish principles operative in a domain-specific moral psychology. However, the candidate "principles" we tested were not derived from work in moral philosophy, but rather from work in the areas of consumer choice and risk perception. Participants were paradoxically less likely to choose an action that sacrifices one life to save others when they were asked to provide more reasons for doing so (Experiment 1), and their willingness to sacrifice lives depended not only on how many lives would be saved, but on the number of lives at risk (Experiment 2). The latter effect was also found in a within-subjects design (Experiment 3). These findings suggest caution in the use of artificial dilemmas as a key testbed for revealing principled bases for moral judgment. Copyright © 2009 Cognitive Science Society, Inc.

  18. Exchange Rates and Fundamentals.

    ERIC Educational Resources Information Center

    Engel, Charles; West, Kenneth D.

    2005-01-01

    We show analytically that in a rational expectations present-value model, an asset price manifests near-random walk behavior if fundamentals are I(1) and the factor for discounting future fundamentals is near one. We argue that this result helps explain the well-known puzzle that fundamental variables such as relative money supplies, outputs,…

  19. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    NASA Astrophysics Data System (ADS)

    Luo, Keqin

    1999-11-01

    The electroplating industry of over 10,000 plating plants nationwide is one of the major waste generators in the industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which costs the industry tremendously for waste treatment and disposal and hinders the further development of the industry. There is, therefore, an urgent need for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize the waste, while production competitiveness can still be maintained. This dissertation aims at developing a novel WM methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, and an intelligent decision support tool. The WM methodology consists of two parts: the heuristic knowledge-based qualitative WM decision analysis and support methodology and fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis on bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation that are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and

  20. Emerging principles of regulatory evolution.

    PubMed

    Prud'homme, Benjamin; Gompel, Nicolas; Carroll, Sean B

    2007-05-15

    Understanding the genetic and molecular mechanisms governing the evolution of morphology is a major challenge in biology. Because most animals share a conserved repertoire of body-building and -patterning genes, morphological diversity appears to evolve primarily through changes in the deployment of these genes during development. The complex expression patterns of developmentally regulated genes are typically controlled by numerous independent cis-regulatory elements (CREs). It has been proposed that morphological evolution relies predominantly on changes in the architecture of gene regulatory networks and in particular on functional changes within CREs. Here, we discuss recent experimental studies that support this hypothesis and reveal some unanticipated features of how regulatory evolution occurs. From this growing body of evidence, we identify three key operating principles underlying regulatory evolution, that is, how regulatory evolution: (i) uses available genetic components in the form of preexisting and active transcription factors and CREs to generate novelty; (ii) minimizes the penalty to overall fitness by introducing discrete changes in gene expression; and (iii) allows interactions to arise among any transcription factor and downstream CRE. These principles endow regulatory evolution with a vast creative potential that accounts for both relatively modest morphological differences among closely related species and more profound anatomical divergences among groups at higher taxonomical levels.

  1. Principles and ethics in scientific communication in biomedicine.

    PubMed

    Donev, Doncho

    2013-12-01

    To present the basic principles and standards of scientific communication and of writing a paper, to indicate the importance of honesty and an ethical approach to research and to the publication of results in scientific journals, and to highlight the need for continuing education in the principles and ethics of science and publication in biomedicine. The work is based on an analysis of relevant materials and documents, internet sources, the published literature, and the author's personal experience and observations. Over the past 20-plus years, increasing emphasis has been placed on respecting fundamental principles and standards of scientific communication and on an ethical approach to research and the publication of results in peer-reviewed journals. Progress in the scientific community rests on the honesty and fairness of researchers in conducting and publishing research, and on the development of guidelines and policies for the prevention and punishment of publication misconduct. Today, scientific communication standards and definitions of fraud in science and publishing are generally consistent, but policies and approaches to ethics education in science, and to the prevention and punishment of misconduct in research and publication, vary considerably. It is necessary to further strengthen the capacity for education and research and to raise awareness of the importance of, and need for, education about the principles of scientific communication, research ethics, and the publication of results. The use of various forms of education of the scientific community, in undergraduate teaching and in postgraduate master's and doctoral studies, in order to create an ethical environment, is one of the most effective ways to prevent scientific and publication dishonesty and fraud.

  2. Design for Natural Breast Augmentation: The ICE Principle.

    PubMed

    Mallucci, Patrick; Branford, Olivier Alexandre

    2016-06-01

    The authors' published studies have helped define breast beauty in outlining key parameters that contribute to breast attractiveness. The "ICE" principle puts design into practice. It is a simplified formula for inframammary fold incision planning as part of the process for determining implant selection and placement to reproduce the 45:55 ratio previously described as fundamental to natural breast appearance. The formula is as follows: implant dimensions (I) - capacity of the breast (C) = excess tissue required (E). The aim of this study was to test the accuracy of the ICE principle for producing consistent natural beautiful results in breast augmentation. A prospective analysis of 50 consecutive women undergoing primary breast augmentation by means of an inframammary fold incision with anatomical or round implants was performed. The ICE principle was applied to all cases to determine implant selection, placement, and incision position. Changes in parameters between preoperative and postoperative digital clinical photographs were analyzed. The mean upper pole-to-lower pole ratio changed from 52:48 preoperatively to 45:55 postoperatively (p < 0.0001). Mean nipple angulation was also statistically significantly elevated from 11 degrees to 19 degrees skyward (p ≤ 0.0005). Accuracy of incision placement in the fold was 99.7 percent on the right and 99.6 percent on the left, with a standard error of only 0.2 percent. There was a reduction in variability for all key parameters. The authors have shown using the simple ICE principle for surgical planning in breast augmentation that attractive natural breasts may be achieved consistently and with precision. Therapeutic, IV.

  3. Water System Adaptation to Hydrological Changes: Module 10, Basic Principles of Incorporating Adaptation Science into Hydrologic Planning and Design

    EPA Science Inventory

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  4. First-principles atomistic Wulff constructions for an equilibrium rutile TiO2 shape modeling

    NASA Astrophysics Data System (ADS)

    Jiang, Fengzhou; Yang, Lei; Zhou, Dali; He, Gang; Zhou, Jiabei; Wang, Fanhou; Chen, Zhi-Gang

    2018-04-01

    Identifying the exposed surfaces of rutile TiO2 crystals is crucial for their industrial application and surface engineering. In this study, the equilibrium shape of rutile TiO2 was constructed by applying equilibrium thermodynamics to TiO2 crystals via first-principles density functional theory (DFT) and the Wulff construction. From the DFT calculations, the surface energies of six low-index stoichiometric facets of TiO2 are determined after calibration of the crystal structure. Then, combining the surface energy calculations with the Wulff construction, a geometric model of equilibrium rutile TiO2 is built up, consistent with the typical morphology of fully developed equilibrium TiO2 crystals. This study provides fundamental theoretical guidance for the surface analysis and surface modification of rutile TiO2-based materials, from experimental research to industrial manufacturing.
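
    The Wulff construction invoked here reduces to one relation: on the equilibrium crystal shape, the perpendicular distance h_hkl of each facet from the crystal centre is proportional to its surface energy γ_hkl (standard notation, not the authors'):

        \[
          \frac{\gamma_{hkl}}{h_{hkl}} = \text{const}\quad\text{for every facet }(hkl)\text{ on the equilibrium shape},
        \]

    so the low-energy facets obtained from the DFT calculations sit closest to the centre and therefore dominate the predicted equilibrium morphology.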

  5. Cyclic density functional theory: A route to the first principles simulation of bending in nanostructures

    NASA Astrophysics Data System (ADS)

    Banerjee, Amartya S.; Suryanarayana, Phanish

    2016-11-01

    We formulate and implement Cyclic Density Functional Theory (Cyclic DFT) - a self-consistent first principles simulation method for nanostructures with cyclic symmetries. Using arguments based on Group Representation Theory, we rigorously demonstrate that the Kohn-Sham eigenvalue problem for such systems can be reduced to a fundamental domain (or cyclic unit cell) augmented with cyclic-Bloch boundary conditions. Analogously, the equations of electrostatics appearing in Kohn-Sham theory can be reduced to the fundamental domain augmented with cyclic boundary conditions. By making use of this symmetry cell reduction, we show that the electronic ground-state energy and the Hellmann-Feynman forces on the atoms can be calculated using quantities defined over the fundamental domain. We develop a symmetry-adapted finite-difference discretization scheme to obtain a fully functional numerical realization of the proposed approach. We verify that our formulation and implementation of Cyclic DFT is both accurate and efficient through selected examples. The connection of cyclic symmetries with uniform bending deformations provides an elegant route to the ab-initio study of bending in nanostructures using Cyclic DFT. As a demonstration of this capability, we simulate the uniform bending of a silicene nanoribbon and obtain its energy-curvature relationship from first principles. A self-consistent ab-initio simulation of this nature is unprecedented and well outside the scope of any other systematic first principles method in existence. Our simulations reveal that the bending stiffness of the silicene nanoribbon is intermediate between that of graphene and molybdenum disulphide - a trend which can be ascribed to the variation in effective thickness of these materials. We describe several future avenues and applications of Cyclic DFT, including its extension to the study of non-uniform bending deformations and its possible use in the study of the nanoscale flexoelectric effect.

  6. Team-Based Learning Practices and Principles in Comparison with Cooperative Learning and Problem-Based Learning

    ERIC Educational Resources Information Center

    Michaelsen, Larry K.; Davidson, Neil; Major, Claire Howell

    2014-01-01

    The authors address three questions: (1) What are the foundational practices of team-based learning (TBL)? (2) What are the fundamental principles underlying TBL's foundational practices? and (3) In what ways are TBL's foundational practices similar to and/or different from the practices employed by problem-based learning (PBL) and…

  7. First principles design of a core bioenergetic transmembrane electron-transfer protein

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goparaju, Geetha; Fry, Bryan A.; Chobot, Sarah E.

    Here we describe the design, Escherichia coli expression and characterization of a simplified, adaptable and functionally transparent single chain 4-α-helix transmembrane protein frame that binds multiple heme and light activatable porphyrins. Such man-made cofactor-binding oxidoreductases, designed from first principles with minimal reference to natural protein sequences, are known as maquettes. This design is an adaptable frame aiming to uncover core engineering principles governing bioenergetic transmembrane electron-transfer function and recapitulate protein archetypes proposed to represent the origins of photosynthesis. This article is part of a Special Issue entitled Biodesign for Bioenergetics — the design and engineering of electronic transfer cofactors, proteins and protein networks, edited by Ronald L. Koder and J.L. Ross Anderson.

  8. The STEP mission - Satellite test of the equivalence principle

    NASA Technical Reports Server (NTRS)

    Atzei, A.; Swanson, P.; Anselmi, A.

    1992-01-01

    The STEP experiment is a joint ESA/NASA mission candidate for selection as the next medium science project in the ESA scientific program. ESA and NASA have undertaken a joint feasibility study of STEP. The principles of STEP and details of the mission are presented and the mission and spacecraft are described. The primary objective of STEP is to measure differences in the rate of fall of test masses of different compositions to one part in 10^17 of the total gravitational acceleration, a factor of 10^8 improvement in sensitivity over previous experiments. STEP constitutes a comparison of gravitational and inertial mass or a test of the weak equivalence principle (WEP). A test of WEP that is six orders of magnitude more accurate than previous tests will reveal whether the underlying structure of the universe is filled with undiscovered small forces, necessitating a fundamental change in our theories of matter on all scales.

  9. [The principle of the energy minimum in ontogeny and the channeling of developmental processes].

    PubMed

    Ozerniuk, N D

    1989-01-01

    The principle of minimum energy in ontogeny has been formulated on the basis of data concerning age-related changes in energy metabolism, as well as the influence of ecological factors on this process. According to this principle, the smallest expenditures of energy are observed in the zone of the most favorable developmental conditions. The minimal level of energy metabolism at every developmental stage, corresponding to the most stable state of the organism, is treated as homeostasis, and developmental stability is treated as homeorhesis. Regulatory mechanisms of energy metabolism during ontogeny and under the influence of environmental factors are analyzed.

  10. Colonic transit time and pressure based on Bernoulli's principle.

    PubMed

    Uno, Yoshiharu

    2018-01-01

    Variations in the caliber of the human large intestinal tract cause changes in the pressure and velocity of its contents, depending on flow volume, gravity, and density, which are all variables of Bernoulli's principle. It was therefore hypothesized that constipation and diarrhea can occur due to changes in the colonic transit time (CTT), according to Bernoulli's principle. In addition, it was hypothesized that high-amplitude peristaltic contractions (HAPC), which are considered to be involved in defecation in healthy subjects, occur because of cecal pressure, again based on Bernoulli's principle. A virtual healthy model (VHM), a virtual constipation model and a virtual diarrhea model were set up. For each model, the CTT was determined from the length of each part of the colon, with the velocity calculated from the cecal inflow volume. In the VHM, the pressure change was calculated and its consistency with HAPC was verified. The CTT changed according to the difference between the cecal inflow volume and the caliber of the intestinal tract, and was inversely proportional to the cecal inflow volume. Compared with the VHM, the CTT was prolonged in the virtual constipation model and shortened in the virtual diarrhea model. The calculated pressure of the VHM and the gradient of the corresponding graph were similar to those of HAPC. The CTT and HAPC can be explained by Bernoulli's principle, and constipation and diarrhea may be fundamentally influenced by flow dynamics.
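
    The two standard relations the virtual models rely on are reproduced below in generic notation (P pressure, ρ density, v velocity, g gravitational acceleration, h height, A luminal cross-section); they are background physics rather than results of the paper:

        \[
          P + \tfrac{1}{2}\rho v^{2} + \rho g h = \text{const along a streamline},
          \qquad
          A_{1} v_{1} = A_{2} v_{2}\ \ \text{(continuity, incompressible flow)},
        \]

    so a narrowing of the lumen raises the local velocity and, by Bernoulli's relation, lowers the local pressure, which is the link between caliber, transit time and pressure described in the abstract.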

  11. Arguing against fundamentality

    NASA Astrophysics Data System (ADS)

    McKenzie, Kerry

    This paper aims to open up discussion on the relationship between fundamentality and naturalism, and in particular on the question of whether fundamentality may be denied on naturalistic grounds. A historico-inductive argument for an anti-fundamentalist conclusion, prominent within the contemporary metaphysical literature, is examined; finding it wanting, an alternative 'internal' strategy is proposed. By means of an example from the history of modern physics - namely S-matrix theory - it is demonstrated that (1) this strategy can generate similar (though not identical) anti-fundamentalist conclusions on more defensible naturalistic grounds, and (2) that fundamentality questions can be empirical questions. Some implications and limitations of the proposed approach are discussed.

  12. Fundamentally updating fundamentals.

    PubMed

    Armstrong, Gail; Barton, Amy

    2013-01-01

    Recent educational research indicates that the six competencies of the Quality and Safety Education for Nurses initiative are best introduced in early prelicensure clinical courses. Content specific to quality and safety has traditionally been covered in senior level courses. This article illustrates an effective approach to using quality and safety as an organizing framework for any prelicensure fundamentals of nursing course. Providing prelicensure students a strong foundation in quality and safety in an introductory clinical course facilitates early adoption of quality and safety competencies as core practice values. Copyright © 2013 Elsevier Inc. All rights reserved.

  13. Fundamental Physics with Antihydrogen

    NASA Astrophysics Data System (ADS)

    Hangst, J. S.

    Antihydrogen—the antimatter equivalent of the hydrogen atom—is of fundamental interest as a test bed for universal symmetries—such as CPT and the Weak Equivalence Principle for gravitation. Invariance under CPT requires that hydrogen and antihydrogen have the same spectrum. Antimatter is of course intriguing because of the observed baryon asymmetry in the universe—currently unexplained by the Standard Model. At the CERN Antiproton Decelerator (AD) [1], several groups have been working diligently since 1999 to produce, trap, and study the structure and behaviour of the antihydrogen atom. One of the main thrusts of the AD experimental program is to apply precision techniques from atomic physics to the study of antimatter. Such experiments complement the high-energy searches for physics beyond the Standard Model. Antihydrogen is the only atom of antimatter to be produced in the laboratory. This is not so unfortunate, as its matter equivalent, hydrogen, is one of the most well-understood and accurately measured systems in all of physics. It is thus very compelling to undertake experimental examinations of the structure of antihydrogen. As experimental spectroscopy of antihydrogen has yet to begin in earnest, I will give here a brief introduction to some of the ion and atom trap developments necessary for synthesizing and trapping antihydrogen, so that it can be studied.

  14. Examining the Use of First Principles of Instruction by Instructional Designers in a Short-Term, High Volume, Rapid Production of Online K-12 Teacher Professional Development Modules

    ERIC Educational Resources Information Center

    Mendenhall, Anne M.

    2012-01-01

    Merrill (2002a) created a set of fundamental principles of instruction that can lead to effective, efficient, and engaging (e[superscript 3]) instruction. The First Principles of Instruction (Merrill, 2002a) are a prescriptive set of interrelated instructional design practices that consist of activating prior knowledge, using specific portrayals…

  15. Exploiting the Temperature Dependence of Magnetic Susceptibility to Control Convection in Fundamental Studies of Solidification Phenomena

    NASA Technical Reports Server (NTRS)

    Seybert, C.; Evans, J. W.; Leslie, F.; Jones, W. K., Jr.

    2001-01-01

    It is well known that convection is a dominant mass transport mechanism when materials are solidified on Earth's surface. This convection is caused by gradients in density (and therefore gravitational force) that are brought about by gradients in temperature, composition or both. Diffusion of solute is therefore dwarfed by convection and the study of fundamental parameters, such as dendrite tip shape and growth velocity in the absence of convection is nearly impossible. Significant experimental work has therefore been carried out in orbiting laboratories with the intent of minimizing convection by minimizing gravity. One of the best known experiments of this kind is the Isothermal Dendritic Growth Experiment (IDGE), supported by NASA. Naturally such experiments are costly and one objective of the present investigation is to develop an experimental method whereby convection can be halted, in solidification and other experiments, on the Earth's surface. A second objective is to use the method to minimize convection resulting from the residual accelerations suffered by experiments in microgravity.

  16. Applying design principles to fusion reactor configurations for propulsion in space

    NASA Technical Reports Server (NTRS)

    Carpenter, Scott A.; Deveny, Marc E.; Schulze, Norman R.

    1993-01-01

    The application of fusion power to space propulsion requires rethinking the engineering-design solution to controlled-fusion energy. Whereas the unit cost of electricity (COE) drives the engineering-design solution for utility-based fusion reactor configurations, initial mass to low Earth orbit (IMLEO), specific jet power (kW(thrust)/kg(engine)), and reusability drive the engineering-design solution for successful application of fusion power to space propulsion. We applied three design principles (DP's) to adapt and optimize three candidate-terrestrial-fusion-reactor configurations for propulsion in space. The three design principles are: provide maximum direct access to space for waste radiation, operate components as passive radiators to minimize cooling-system mass, and optimize the plasma fuel, fuel mix, and temperature for best specific jet power. The three candidate terrestrial fusion reactor configurations are: the thermal barrier tandem mirror (TBTM), field reversed mirror (FRM), and levitated dipole field (LDF). The resulting three candidate space fusion propulsion systems have their IMLEO minimized and their specific jet power and reusability maximized. We performed a preliminary rating of these configurations and concluded that the leading engineering-design solution to space fusion propulsion is a modified TBTM that we call the Mirror Fusion Propulsion System (MFPS).

  17. A New Type of Atom Interferometry for Testing Fundamental Physics

    NASA Astrophysics Data System (ADS)

    Lorek, Dennis; Lämmerzahl, Claus; Wicht, Andreas

    We present a new type of atom interferometer (AI) that provides a tool for ultra-high precision tests of fundamental physics. As an example we present how an AI based on highly charged hydrogen-like atoms is affected by gravitational waves (GW). A qualitative description of the quantum interferometric measurement principle is given, the modifications in the atomic Hamiltonian caused by the GW are presented, and the size of the resulting frequency shifts in hydrogen-like atoms is estimated. For a GW amplitude of h = 10⁻²³ the frequency shift is of the order of 110 μHz for an AI based on a 91-fold charged uranium ion. A frequency difference of this size can be resolved by current AIs in 1 s.

  18. How fundamental are fundamental constants?

    NASA Astrophysics Data System (ADS)

    Duff, M. J.

    2015-01-01

    I argue that the laws of physics should be independent of one's choice of units or measuring apparatus. This is the case if they are framed in terms of dimensionless numbers such as the fine structure constant, α. For example, the standard model of particle physics has 19 such dimensionless parameters whose values all observers can agree on, irrespective of what clocks, rulers or scales they use to measure them. Dimensional constants, on the other hand, such as ħ, c, G, e and k_B, are merely human constructs whose number and values differ from one choice of units to the next. In this sense, only dimensionless constants are 'fundamental'. Similarly, the possible time variation of dimensionless fundamental 'constants' of nature is operationally well defined and a legitimate subject of physical enquiry. By contrast, the time variation of dimensional constants such as c or G, on which a good many (in my opinion, confusing) papers have been written, is a unit-dependent phenomenon on which different observers might disagree depending on their apparatus. All these confusions disappear if one asks only unit-independent questions. We provide a selection of opposing opinions in the literature and respond accordingly.

  19. Fermi paradox and alternative strategies for SETI programs - The anthropic principle and the search for close solar analogs

    NASA Astrophysics Data System (ADS)

    Fracassini, Massimo; Pasinetti Fracassini, Laura E.; Pasinetti, Antonio L.

    1988-07-01

    The Anthropic Principle, a new trend of modern cosmology, claims that the origin of life and the development of intelligent beings on the Earth are the result of highly selective biological processes, strictly tuned to the fundamental physical characteristics of the Universe. This principle could account for the failure of some programs of search for extraterrestrial intelligences (SETI) and suggests the search for strict solar analogs as a primary target for SETI strategies. In this connection, the authors have selected 22 solar analogs and discussed their choice.

  20. Bernoulli's Principle

    ERIC Educational Resources Information Center

    Hewitt, Paul G.

    2004-01-01

    Some teachers have difficulty understanding Bernoulli's principle particularly when the principle is applied to the aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…

  1. The four principles: Can they be measured and do they predict ethical decision making?

    PubMed Central

    2012-01-01

    Background The four principles of Beauchamp and Childress - autonomy, non-maleficence, beneficence and justice - have been extremely influential in the field of medical ethics, and are fundamental for understanding the current approach to ethical assessment in health care. This study tests whether these principles can be quantitatively measured on an individual level, and then subsequently if they are used in the decision making process when individuals are faced with ethical dilemmas. Methods The Analytic Hierarchy Process was used as a tool for the measurement of the principles. Four scenarios, which involved conflicts between the medical ethical principles, were presented to participants who then made judgments about the ethicality of the action in the scenario, and their intentions to act in the same manner if they were in the situation. Results Individual preferences for these medical ethical principles can be measured using the Analytic Hierarchy Process. This technique provides a useful tool with which to highlight individual medical ethical values. On average, individuals have a significant preference for non-maleficence over the other principles; however, and perhaps counter-intuitively, this preference does not seem to relate to applied ethical judgements in specific ethical dilemmas. Conclusions People state they value these medical ethical principles but they do not actually seem to use them directly in the decision making process. The reasons for this are explained through the lack of a behavioural model to account for the relevant situational factors not captured by the principles. The limitations of the principles in predicting ethical decision making are discussed. PMID:22606995

  2. The four principles: can they be measured and do they predict ethical decision making?

    PubMed

    Page, Katie

    2012-05-20

    The four principles of Beauchamp and Childress--autonomy, non-maleficence, beneficence and justice--have been extremely influential in the field of medical ethics, and are fundamental for understanding the current approach to ethical assessment in health care. This study tests whether these principles can be quantitatively measured on an individual level, and then subsequently if they are used in the decision making process when individuals are faced with ethical dilemmas. The Analytic Hierarchy Process was used as a tool for the measurement of the principles. Four scenarios, which involved conflicts between the medical ethical principles, were presented to participants who then made judgments about the ethicality of the action in the scenario, and their intentions to act in the same manner if they were in the situation. Individual preferences for these medical ethical principles can be measured using the Analytic Hierarchy Process. This technique provides a useful tool with which to highlight individual medical ethical values. On average, individuals have a significant preference for non-maleficence over the other principles; however, and perhaps counter-intuitively, this preference does not seem to relate to applied ethical judgements in specific ethical dilemmas. People state they value these medical ethical principles but they do not actually seem to use them directly in the decision making process. The reasons for this are explained through the lack of a behavioural model to account for the relevant situational factors not captured by the principles. The limitations of the principles in predicting ethical decision making are discussed.
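
    As a concrete illustration of the measurement tool named in these two records, the sketch below computes Analytic Hierarchy Process priority weights for the four principles from a pairwise-comparison matrix via the principal eigenvector, together with Saaty's consistency ratio. The comparison values are invented for illustration and are not data from the study.

    ```python
    import numpy as np

    # Hypothetical pairwise-comparison matrix for the four principles
    # (autonomy, non-maleficence, beneficence, justice) on Saaty's 1-9 scale.
    A = np.array([
        [1.0, 1/3, 1.0, 2.0],
        [3.0, 1.0, 3.0, 4.0],
        [1.0, 1/3, 1.0, 2.0],
        [0.5, 1/4, 0.5, 1.0],
    ])

    # AHP priority weights = normalized principal right eigenvector.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()

    # Saaty's consistency index and ratio (random index RI = 0.90 for n = 4).
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)
    cr = ci / 0.90

    print("weights:", np.round(w, 3))        # non-maleficence receives the largest weight here
    print("consistency ratio:", round(cr, 3))
    ```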

  3. The Real and the Mathematical in Quantum Modeling: From Principles to Models and from Models to Principles

    NASA Astrophysics Data System (ADS)

    Plotnitsky, Arkady

    2017-06-01

    The history of mathematical modeling outside physics has been dominated by the use of classical mathematical models, C-models, primarily those of a probabilistic or statistical nature. More recently, however, quantum mathematical models, Q-models, based in the mathematical formalism of quantum theory have become more prominent in psychology, economics, and decision science. The use of Q-models in these fields remains controversial, in part because it is not entirely clear whether Q-models are necessary for dealing with the phenomena in question or whether C-models would still suffice. My aim, however, is not to assess the necessity of Q-models in these fields, but instead to reflect on what the possible applicability of Q-models may tell us about the corresponding phenomena there, vis-à-vis quantum phenomena in physics. In order to do so, I shall first discuss the key reasons for the use of Q-models in physics. In particular, I shall examine the fundamental principles that led to the development of quantum mechanics. Then I shall consider a possible role of similar principles in using Q-models outside physics. Psychology, economics, and decision science borrow already available Q-models from quantum theory, rather than derive them from their own internal principles, while quantum mechanics was derived from such principles, because there was no readily available mathematical model to handle quantum phenomena, although the mathematics ultimately used in quantum mechanics did in fact exist then. I shall argue, however, that the principle perspective on mathematical modeling outside physics might help us to understand better the role of Q-models in these fields and possibly to envision new models, conceptually analogous to but mathematically different from those of quantum theory, helpful or even necessary there or in physics itself. I shall suggest one possible type of such models, singularized probabilistic, SP, models, some of which are time-dependent, TDSP-models…

  4. Systems Biology of Skeletal Muscle: Fiber Type as an Organizing Principle

    PubMed Central

    Greising, Sarah M; Gransee, Heather M; Mantilla, Carlos B; Sieck, Gary C

    2012-01-01

    Skeletal muscle force generation and contraction are fundamental to countless aspects of human life. The complexity of skeletal muscle physiology is simplified by fiber type classification where differences are observed from neuromuscular transmission to release of intracellular Ca2+ from the sarcoplasmic reticulum and the resulting recruitment and cycling of cross-bridges. This review uses fiber type classification as an organizing and simplifying principle to explore the complex interactions between the major proteins involved in muscle force generation and contraction. PMID:22811254

  5. Microswitch- and VOCA-Assisted Programs for Two Post-Coma Persons with Minimally Conscious State and Pervasive Motor Disabilities

    ERIC Educational Resources Information Center

    Lancioni, Giulio E.; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff; Buonocunto, Francesca; Sacco, Valentina; Colonna, Fabio; Navarro, Jorge; Oliva, Doretta; Signorino, Mario; Megna, Gianfranco

    2009-01-01

    Intervention programs, based on learning principles and assistive technology, were assessed in two studies with two post-coma men with minimally conscious state and pervasive motor disabilities. Study I assessed a program that included (a) an optic microswitch, activated via double blinking, which allowed a man direct access to brief music…

  6. Measurement of fundamental illite particle thicknesses by X-ray diffraction using PVP-10 intercalation

    USGS Publications Warehouse

    Eberl, D.D.; Nüesch, R.; Šucha, Vladimír; Tsipursky, S.

    1998-01-01

    The thicknesses of fundamental illite particles that compose mixed-layer illite-smectite (I-S) crystals can be measured by X-ray diffraction (XRD) peak broadening techniques (Bertaut-Warren-Averbach [BWA] method and integral peak-width method) if the effects of swelling and XRD background noise are eliminated from XRD patterns of the clays. Swelling is eliminated by intercalating Na-saturated I-S with polyvinylpyrrolidone having a molecular weight of 10,000 (PVP-10). Background is minimized by using polished metallic silicon wafers cut perpendicular to (100) as a substrate for XRD specimens, and by using a single-crystal monochromator. XRD measurements of PVP-intercalated diagenetic, hydrothermal and low-grade metamorphic I-S indicate that there are at least 2 types of crystallite thickness distribution shapes for illite fundamental particles, lognormal and asymptotic; that measurements of mean fundamental illite particle thicknesses made by various techniques (Bertaut-Warren-Averbach, integral peak width, fixed cation content, and transmission electron microscopy [TEM]) give comparable results; and that strain (small differences in layer thicknesses) generally has a Gaussian distribution in the log-normal-type illites, but is often absent in the asymptotic-type illites.
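
    As a pointer to how an integral peak width translates into a mean particle thickness, the sketch below applies the simple Scherrer/integral-breadth relation T = Kλ/(β cos θ); the Bertaut-Warren-Averbach analysis used in the paper is considerably more involved, and all input values here are illustrative rather than measured.

    ```python
    import numpy as np

    # Mean crystallite (fundamental particle) thickness from the integral breadth
    # of the illite 001 reflection, T = K * wavelength / (beta * cos(theta)).
    # Illustrative inputs, not measurements from the paper.
    wavelength = 1.5406      # Angstrom, Cu K-alpha
    two_theta = 8.8          # degrees, near the ~10 Angstrom illite 001 peak
    beta_deg = 0.50          # instrument-corrected integral breadth, degrees 2-theta
    K = 0.89                 # Scherrer-type constant for plate-like crystallites

    beta = np.radians(beta_deg)
    theta = np.radians(two_theta / 2.0)
    thickness = K * wavelength / (beta * np.cos(theta))

    print(f"mean thickness ~ {thickness:.0f} Angstrom "
          f"(~{thickness / 10:.0f} one-nm illite layers)")
    ```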

  7. Fundamentals of microfluidic cell culture in controlled microenvironments†

    PubMed Central

    Young, Edmond W. K.; Beebe, David J.

    2010-01-01

    Microfluidics has the potential to revolutionize the way we approach cell biology research. The dimensions of microfluidic channels are well suited to the physical scale of biological cells, and the many advantages of microfluidics make it an attractive platform for new techniques in biology. One of the key benefits of microfluidics for basic biology is the ability to control parameters of the cell microenvironment at relevant length and time scales. Considerable progress has been made in the design and use of novel microfluidic devices for culturing cells and for subsequent treatment and analysis. With the recent pace of scientific discovery, it is becoming increasingly important to evaluate existing tools and techniques, and to synthesize fundamental concepts that would further improve the efficiency of biological research at the microscale. This tutorial review integrates fundamental principles from cell biology and local microenvironments with cell culture techniques and concepts in microfluidics. Culturing cells in microscale environments requires knowledge of multiple disciplines including physics, biochemistry, and engineering. We discuss basic concepts related to the physical and biochemical microenvironments of the cell, physicochemical properties of that microenvironment, cell culture techniques, and practical knowledge of microfluidic device design and operation. We also discuss the most recent advances in microfluidic cell culture and their implications on the future of the field. The goal is to guide new and interested researchers to the important areas and challenges facing the scientific community as we strive toward full integration of microfluidics with biology. PMID:20179823

  8. Exploring 3D optimal channel networks by multiple organizing principles

    NASA Astrophysics Data System (ADS)

    Mason, Emanuele; Bizzi, Simone; Cominola, Andrea; Castelletti, Andrea; Paik, Kyungrock

    2017-04-01

    Catchment topography and flow networks are shaped by the interactions of water and sediment across various spatial and temporal scales. The complexity of these processes hinders the development of models able to assess the validity of general principles governing such phenomena. The theory of Optimal Channel Networks (OCNs) proved that it is possible to generate drainage networks statistically comparable to those observed in nature by minimizing the energy spent by the water flowing through them. So far, the OCN theory has been developed for planar 2D domains, assuming equal energy expenditure per unit area of channel and, correspondingly, a constant slope-discharge relationship. In this work, we apply the OCN theory to 3D problems by introducing a multi-principle minimization starting from an artificial digital elevation model of pyramidal shape. The OCN theory assumption of a constant slope-area relationship is relaxed and embedded into a second-order principle. The modelled 3D channel networks achieve lower total energy expenditure than the corresponding 2D sub-optimal OCNs, which are bound to specific slope-area relationships. This is the first time we are able to explore accessible 3D OCNs starting from a general DEM. By contrasting the modelled 3D OCNs and natural river networks, we found statistical similarities of two indexes, namely the area exponent index and the profile concavity index. Among the wide range of alternative and sub-optimal river networks, a minimum degree of 3D network organization is found to be necessary to keep the index values within the natural range. These networks simultaneously possess topological and topographic properties of real river networks. We found a pivotal functional link between the slope-area relationship and accessible sub-optimal 2D river network paths, which suggests that geological and climate conditions producing slope-area relationships in natural basins co-determine the degree of optimality of accessible network paths.
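
    A minimal sketch of the quantity being minimized may help orient readers: in the classical 2D theory the OCN energy is E = Σ_i A_i^γ with γ ≈ 0.5, summed over the links of a spanning drainage tree. The toy network and helper names below are illustrative and are not taken from the paper.

    ```python
    import numpy as np

    # Sketch of the 2D OCN energy functional E = sum_i A_i**gamma (gamma ~ 0.5),
    # evaluated for a toy drainage tree given as parent pointers.

    def upstream_areas(parent, n_nodes, outlet):
        """Contributing area (in cells) for each node of a drainage tree."""
        area = np.ones(n_nodes)
        # Accumulate children before parents by processing nodes from the
        # farthest (in tree distance) to the closest to the outlet.
        depth = np.zeros(n_nodes, dtype=int)
        for i in range(n_nodes):
            j, d = i, 0
            while j != outlet:
                j, d = parent[j], d + 1
            depth[i] = d
        for i in sorted(range(n_nodes), key=lambda k: -depth[k]):
            if i != outlet:
                area[parent[i]] += area[i]
        return area

    def ocn_energy(area, gamma=0.5):
        return float(np.sum(area ** gamma))

    # Toy 4-node chain draining to node 0:  3 -> 2 -> 1 -> 0
    parent = {0: 0, 1: 0, 2: 1, 3: 2}
    A = upstream_areas(parent, 4, outlet=0)
    print(A, ocn_energy(A))   # areas [4. 3. 2. 1.], E = 2 + sqrt(3) + sqrt(2) + 1 ~ 6.15
    ```

    OCN algorithms then lower this energy by proposing random local changes to the drainage directions and accepting those that do not break the spanning-tree structure.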

  9. Better management of multimorbidity: a critical look at the 'Ariadne principles'.

    PubMed

    Bower, Peter

    2014-12-08

    Primary care clinicians and researchers are growing increasingly aware of the prevalence of multimorbidity among long-term conditions, and the impact on patient experience, health, and utilisation of care. The correspondence paper by Muth et al. entitled 'The Ariadne principles: how to handle multimorbidity in primary care consultations' outlines new thinking on a better way to manage the challenges of decision-making in multimorbidity. The paper highlights the importance of shared treatment goals as a fundamental basis for more effective management. Although a welcome contribution to the literature, the principles raise a number of challenges: the complexities of achieving effective patient-centred assessment and goal-setting; how best to encourage implementation of new practices; and the current state of the evidence around multimorbidity and its management.Please see related article: http://www.biomedcentral.com/1741-7015/12/223.

  10. Universal Principles in the Repair of Communication Problems

    PubMed Central

    Dingemanse, Mark; Roberts, Seán G.; Baranova, Julija; Blythe, Joe; Drew, Paul; Floyd, Simeon; Gisladottir, Rosa S.; Kendrick, Kobin H.; Levinson, Stephen C.; Manrique, Elizabeth; Rossi, Giovanni; Enfield, N. J.

    2015-01-01

    There would be little adaptive value in a complex communication system like human language if there were no ways to detect and correct problems. A systematic comparison of conversation in a broad sample of the world’s languages reveals a universal system for the real-time resolution of frequent breakdowns in communication. In a sample of 12 languages of 8 language families of varied typological profiles we find a system of ‘other-initiated repair’, where the recipient of an unclear message can signal trouble and the sender can repair the original message. We find that this system is frequently used (on average about once per 1.4 minutes in any language), and that it has detailed common properties, contrary to assumptions of radical cultural variation. Unrelated languages share the same three functionally distinct types of repair initiator for signalling problems and use them in the same kinds of contexts. People prefer to choose the type that is the most specific possible, a principle that minimizes cost both for the sender being asked to fix the problem and for the dyad as a social unit. Disruption to the conversation is kept to a minimum, with the two-utterance repair sequence being on average no longer than the single utterance which is being fixed. The findings, controlled for historical relationships, situation types and other dependencies, reveal the fundamentally cooperative nature of human communication and offer support for the pragmatic universals hypothesis: while languages may vary in the organization of grammar and meaning, key systems of language use may be largely similar across cultural groups. They also provide a fresh perspective on controversies about the core properties of language, by revealing a common infrastructure for social interaction which may be the universal bedrock upon which linguistic diversity rests. PMID:26375483

  11. Against Explanatory Minimalism in Psychiatry

    PubMed Central

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell’s criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein’s Zettel. But attention to the context of Wittgenstein’s remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation. PMID:26696908

  12. Information physics fundamentals of nanophotonics.

    PubMed

    Naruse, Makoto; Tate, Naoya; Aono, Masashi; Ohtsu, Motoichi

    2013-05-01

    Nanophotonics has been extensively studied with the aim of unveiling and exploiting light-matter interactions that occur at a scale below the diffraction limit of light, and recent progress made in experimental technologies--both in nanomaterial fabrication and characterization--is driving further advancements in the field. From the viewpoint of information, on the other hand, novel architectures, design and analysis principles, and even novel computing paradigms should be considered so that we can fully benefit from the potential of nanophotonics. This paper examines the information physics aspects of nanophotonics. More specifically, we present some fundamental and emergent information properties that stem from optical excitation transfer mediated by optical near-field interactions and the hierarchical properties inherent in optical near-fields. We theoretically and experimentally investigate aspects such as unidirectional signal transfer, energy efficiency and networking effects, among others, and we present their basic theoretical formalisms and describe demonstrations of practical applications. A stochastic analysis of light-assisted material formation is also presented, where an information-based approach provides a deeper understanding of the phenomena involved, such as self-organization. Furthermore, the spatio-temporal dynamics of optical excitation transfer and its inherent stochastic attributes are utilized for solution searching, paving the way to a novel computing paradigm that exploits coherent and dissipative processes in nanophotonics.

  13. Principles and Ethics in Scientific Communication in Biomedicine

    PubMed Central

    Donev, Doncho

    2013-01-01

    Introduction and aim: To present the basic principles and standards of scientific communication and of writing a paper, to indicate the importance of an honest and ethical approach to research and to the publication of results in scientific journals, and to highlight the need for continuing education in the principles and ethics of science and publication in biomedicine. Methods: An analysis of relevant materials and documents, sources from the internet and the published literature, and the personal experience and observations of the author. Results: Over the past 20 years and more, increasing emphasis has been placed on respecting fundamental principles and standards of scientific communication and on an ethical approach to research and the publication of results in peer-reviewed journals. Advancement in the scientific community is based on the honesty and equity of researchers in conducting and publishing research, and on the development of guidelines and policies for the prevention and punishment of publication misconduct. Today, standards of scientific communication and definitions of fraud in science and publishing are generally consistent, but policies and approaches to ethics education in science, and to the prevention and punishment of misconduct in research and publication, vary considerably. Conclusion: It is necessary to further strengthen the capacity for education and research and to raise awareness of the importance of, and need for, education about the principles of scientific communication and the ethics of research and publication. The use of various forms of education of the scientific community, in undergraduate teaching and in postgraduate master's and doctoral studies, in order to create an ethical environment, is one of the most effective ways to prevent scientific and publication dishonesty and fraud. PMID:24505166

  14. Fundamentals of metasurface lasers based on resonant dark states

    NASA Astrophysics Data System (ADS)

    Droulias, Sotiris; Jain, Aditya; Koschny, Thomas; Soukoulis, Costas M.

    2017-10-01

    Recently, our group proposed a metamaterial laser design based on explicitly coupled dark resonant states in low-loss dielectrics, which conceptually separates the gain-coupled resonant photonic state responsible for macroscopic stimulated emission from the coupling to specific free-space propagating modes, allowing independent adjustment of the lasing state and its coherent radiation output. Due to this functionality, it is now possible to make lasers that can overcome the trade-off between system dimensions and Q factor, especially for surface emitting lasers with deeply subwavelength thickness. Here, we give a detailed discussion of the key functionality and benefits of this design, such as radiation damping tunability, directionality, subwavelength integration, and simple layer-by-layer fabrication. We examine in detail the fundamental design tradeoffs that establish the principle of operation and must be taken into account and give guidance for realistic implementations.

  15. Higgs varieties and fundamental groups

    NASA Astrophysics Data System (ADS)

    Bruzzo, Ugo; Graña Otero, Beatriz

    2018-06-01

    After reviewing some "fundamental group schemes" that can be attached to a variety by means of Tannaka duality, we consider the example of the Higgs fundamental group scheme, surveying its main properties and relations with the other fundamental groups, and giving some examples.

  16. Modified fundamental Airy wave.

    PubMed

    Seshadri, S R

    2014-01-01

    The propagation characteristics of the fundamental Airy wave are obtained; the intensity distribution is the same as that for a point electric dipole situated at the origin and oriented normal to the propagation direction. The propagation characteristics of the modified fundamental Airy wave are determined. These characteristics are the same as those for the fundamental Gaussian wave provided that an equivalent waist is identified for the Airy wave. In general, the waves are localized spatially with the peak in the propagation direction.

  17. Eugene Wigner and Fundamental Symmetry Principles

    Science.gov Websites

    [Web page listing documents related to Eugene Wigner, including DOE technical reports from 1944-1951 (among them "Effect of the Temperature of the Moderator on the Velocity ..." and "The Magnitude of the Eta Effect", April 25, 1951) and a "Wigner Honored" item.]

  18. Registration of Aerial Optical Images with LiDAR Data Using the Closest Point Principle and Collinearity Equations.

    PubMed

    Huang, Rongyong; Zheng, Shunyi; Hu, Kun

    2018-06-01

    Registration of large-scale optical images with airborne LiDAR data is the basis of the integration of photogrammetry and LiDAR. However, geometric misalignments still exist between some aerial optical images and airborne LiDAR point clouds. To eliminate such misalignments, we extended a method for registering close-range optical images with terrestrial LiDAR data to a variety of large-scale aerial optical images and airborne LiDAR data. The fundamental principle is to minimize the distances from the photogrammetric matching points to the terrestrial LiDAR data surface. In addition to the satisfactory efficiency of about 79 s per 6732 × 8984 image, the experimental results show that the unit weighted root mean square (RMS) of the image points is able to reach a sub-pixel level (0.45 to 0.62 pixel), and the actual horizontal and vertical accuracy can be greatly improved to a high level of 1/4-1/2 (0.17-0.27 m) and 1/8-1/4 (0.10-0.15 m) of the average LiDAR point distance, respectively. Finally, the method is proved to be more accurate, feasible, efficient, and practical for a variety of large-scale aerial optical image and LiDAR data.
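
    The registration idea summarized above, minimizing the distances from photogrammetric matching points to the LiDAR surface, can be sketched as a small point-to-plane least-squares problem. The snippet below is a generic illustration, not the paper's implementation: it assumes the nearest LiDAR surface points and normals have already been found (e.g. with a k-d tree), and all names and numbers are synthetic.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def rot(rx, ry, rz):
        # Rotation matrix from small Euler angles (fine-registration correction).
        Rx = np.array([[1, 0, 0], [0, np.cos(rx), -np.sin(rx)], [0, np.sin(rx), np.cos(rx)]])
        Ry = np.array([[np.cos(ry), 0, np.sin(ry)], [0, 1, 0], [-np.sin(ry), 0, np.cos(ry)]])
        Rz = np.array([[np.cos(rz), -np.sin(rz), 0], [np.sin(rz), np.cos(rz), 0], [0, 0, 1]])
        return Rz @ Ry @ Rx

    def residuals(x, p, q, n):
        # Signed point-to-plane distances of the transformed points p to planes (q, n).
        R, t = rot(*x[:3]), x[3:]
        return np.einsum('ij,ij->i', (p @ R.T + t) - q, n)

    rng = np.random.default_rng(0)
    q = rng.uniform(0, 100, (400, 3))                       # synthetic LiDAR surface samples
    q[:, 2] = 0.001 * (q[:, 0] - 50)**2 + 0.002 * (q[:, 1] - 50)**2
    n = np.column_stack([-0.002 * (q[:, 0] - 50), -0.004 * (q[:, 1] - 50), np.ones(len(q))])
    n /= np.linalg.norm(n, axis=1, keepdims=True)           # surface normals

    true = np.array([0.002, -0.001, 0.003, 0.4, -0.2, 0.15])  # simulated misalignment
    p = (q - true[3:]) @ rot(*true[:3])                       # "photogrammetric" points

    sol = least_squares(residuals, np.zeros(6), args=(p, q, n))
    print(np.round(sol.x, 4))   # recovers the simulated misalignment to solver tolerance
    ```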

  19. Challenging the principle of proportionality.

    PubMed

    Andersson, Anna-Karin Margareta

    2016-04-01

    The first objective of this article is to examine one aspect of the principle of proportionality (PP) as advanced by Alan Gewirth in his 1978 book Reason and Morality. Gewirth claims that being capable of exercising agency to some minimal degree is a property that justifies having at least prima facie rights not to get killed. However, according to the PP, before the being possesses the capacity for exercising agency to that minimal degree, the extent of her rights depends on to what extent she approaches possession of agential capacities. One interpretation of the PP holds that variations in degree of possession of the physical constitution necessary to exercise agency are morally relevant. The other interpretation holds that only variations in degree of actual mental capacity are morally relevant. The first of these interpretations is vastly more problematic than the other. The second objective is to argue that according to the most plausible interpretation of the PP, the fetus' level of development before at least the 20th week of pregnancy does not affect the fetus' moral rights status. I then suggest that my argument is not restricted to such fetuses, although extending my argument to more developed fetuses requires caution. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  20. Colonic transit time and pressure based on Bernoulli’s principle

    PubMed Central

    Uno, Yoshiharu

    2018-01-01

    Purpose: Variations in the caliber of the human large intestinal tract cause changes in the pressure and velocity of its contents, depending on flow volume, gravity, and density, which are all variables of Bernoulli's principle. Therefore, it was hypothesized that constipation and diarrhea can occur due to changes in the colonic transit time (CTT), according to Bernoulli's principle. In addition, it was hypothesized that high amplitude peristaltic contractions (HAPC), which are considered to be involved in defecation in healthy subjects, occur because of cecum pressure based on Bernoulli's principle. Methods: A virtual healthy model (VHM), a virtual constipation model, and a virtual diarrhea model were set up. For each model, the CTT was determined according to the length of each part of the colon, with the velocity calculated from the cecum inflow volume. In the VHM, the pressure change was calculated and its consistency with HAPC was verified. Results: The CTT changed according to the difference between the cecum inflow volume and the caliber of the intestinal tract, and was inversely proportional to the cecum inflow volume. Compared with the VHM, the CTT was prolonged in the virtual constipation model and shortened in the virtual diarrhea model. The calculated pressure of the VHM and the gradient of the interlocked graph were similar to those of HAPC. Conclusion: The CTT and HAPC can be explained by Bernoulli's principle, and constipation and diarrhea may be fundamentally influenced by flow dynamics. PMID:29670388
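
    For reference, the two textbook relations such a virtual model combines are mass continuity and Bernoulli's equation, together with the resulting transit-time scaling (standard forms given for orientation, not quoted from the paper):

    ```latex
    Q = A_1 v_1 = A_2 v_2, \qquad
    p_1 + \tfrac{1}{2}\rho v_1^{2} + \rho g z_1
      = p_2 + \tfrac{1}{2}\rho v_2^{2} + \rho g z_2, \qquad
    t_{\mathrm{transit}} = \frac{L}{v} = \frac{L A}{Q}.
    ```

    For a segment of fixed length L and caliber A, the transit time is thus inversely proportional to the inflow volume Q, consistent with the inverse proportionality of the CTT reported above.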

  1. Lorentz invariance violation and generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel Nasser; Magdy, H.; Ali, A. Farag

    2016-01-01

    There are several theoretical indications that quantum gravity approaches may predict a minimal measurable length and a maximal observable momentum, and thereby a generalization of the Heisenberg uncertainty principle. The generalized uncertainty principle (GUP) is based on a momentum-dependent modification of the standard dispersion relation, which is conjectured to violate the principle of Lorentz invariance. From the resulting Hamiltonian, the velocity and time of flight of relativistic distant particles at Planck energy can be derived. A first comparison is made with recent observations of the redshift dependence of the Hubble parameter in early-type galaxies. We find that Lorentz invariance violation (LIV) makes two types of contributions to the time-of-flight delay Δt, comparable with those observations. Although the OPERA measurement of a faster-than-light muon-neutrino anomaly turned out to be wrong, we utilize its main features to estimate the relative change Δv in the speed of the muon neutrino in dependence on redshift z; accordingly, the results could not be interpreted as LIV. A third comparison is made with ultra-high-energy cosmic rays (UHECR). It is found that an essential ingredient is the approach combining string theory, loop quantum gravity, black hole physics and doubly special relativity, together with the one assuming a perturbative departure from exact Lorentz invariance. Fixing the sensitivity factor and its energy dependence is an essential input for reliably confronting our calculations with UHECR. The sensitivity factor is related to the specific time-of-flight delay and the time structure of the signal. Furthermore, the upper and lower bounds on the parameter α that characterizes the generalized uncertainty principle have to be fixed in related physical systems such as gamma-ray bursts.
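
    For orientation, one widely used linear-plus-quadratic form of the GUP (a common choice in this literature, stated here as background rather than quoted from the paper) is

    ```latex
    [x, p] = i\hbar\left(1 - 2\alpha p + 4\alpha^{2} p^{2}\right), \qquad
    \Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 - 2\alpha\langle p\rangle + 4\alpha^{2}\langle p^{2}\rangle\right),
    ```

    with α of the order of the Planck length divided by ħ; such a deformation implies both a minimal measurable length of order the Planck length and a maximal observable momentum of order the Planck momentum, which is what feeds the modified dispersion relation used in the comparisons above.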

  2. Synthetic Elucidation of Design Principles for Molecular Qubits

    NASA Astrophysics Data System (ADS)

    Graham, Michael James

    Quantum information processing (QIP) is an emerging computational paradigm with the potential to enable a vast increase in computational power, fundamentally transforming fields from structural biology to finance. QIP employs qubits, or quantum bits, as its fundamental units of information, which can exist in not just the classical states of 0 or 1, but in a superposition of the two. In order to successfully perform QIP, this superposition state must be sufficiently long-lived. One promising paradigm for the implementation of QIP involves employing unpaired electrons in coordination complexes as qubits. This architecture is highly tunable and scalable; however, coordination complexes frequently suffer from short superposition lifetimes, or T2. In order to capitalize on the promise of molecular qubits, it is necessary to develop a set of design principles that allow the rational synthesis of complexes with sufficiently long values of T2. In this dissertation, I report efforts to use the synthesis of series of complexes to elucidate design principles for molecular qubits. Chapter 1 details previous work by our group and others in the field. Chapter 2 details the first efforts of our group to determine the impact of varying spin and spin-orbit coupling on T2. Chapter 3 examines the effect of removing nuclear spins on coherence time, and reports a series of vanadyl bis(dithiolene) complexes which exhibit extremely long coherence lifetimes, in excess of the 100 μs threshold for qubit viability. Chapters 4 and 5 form two complementary halves of a study to determine the exact relationship between electronic spin-nuclear spin distance and the effect of the nuclear spins on T2. Finally, chapter 6 suggests next directions for the field as a whole, including the potential for work in this field to impact the development of other technologies as diverse as quantum sensors and magnetic resonance imaging contrast agents.

  3. Minimally Informative Prior Distributions for PSA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dana L. Kelly; Robert W. Youngblood; Kurt G. Vedros

    2010-06-01

    A salient feature of Bayesian inference is its ability to incorporate information from a variety of sources into the inference model, via the prior distribution (hereafter simply “the prior”). However, over-reliance on old information can lead to priors that dominate new data. Some analysts seek to avoid this by trying to work with a minimally informative prior distribution. Another reason for choosing a minimally informative prior is to avoid the often-voiced criticism of subjectivity in the choice of prior. Minimally informative priors fall into two broad classes: 1) so-called noninformative priors, which attempt to be completely objective, in that the posterior distribution is determined as completely as possible by the observed data, the most well known example in this class being the Jeffreys prior, and 2) priors that are diffuse over the region where the likelihood function is nonnegligible, but that incorporate some information about the parameters being estimated, such as a mean value. In this paper, we compare four approaches in the second class, with respect to their practical implications for Bayesian inference in Probabilistic Safety Assessment (PSA). The most commonly used such prior, the so-called constrained noninformative prior, is a special case of the maximum entropy prior. This is formulated as a conjugate distribution for the most commonly encountered aleatory models in PSA, and is correspondingly mathematically convenient; however, it has a relatively light tail and this can cause the posterior mean to be overly influenced by the prior in updates with sparse data. A more informative prior that is capable, in principle, of dealing more effectively with sparse data is a mixture of conjugate priors. A particular diffuse nonconjugate prior, the logistic-normal, is shown to behave similarly for some purposes. Finally, we review the so-called robust prior. Rather than relying on the mathematical abstraction of entropy, as does the…
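
    A small sketch of the sparse-data effect discussed above, using conjugate gamma updating for a Poisson aleatory model: the Jeffreys prior is compared with a gamma prior of shape 0.5 constrained to a given mean, a common approximation to the constrained noninformative prior. The event count, exposure, and prior mean are illustrative, not taken from the report.

    ```python
    from scipy import stats

    # Sparse data: x events observed over an exposure time T (illustrative numbers).
    x, T = 0, 5.0

    # Jeffreys prior for a Poisson rate ~ Gamma(0.5, 0) (improper); the posterior
    # is then Gamma(x + 0.5, rate = T).
    post_jeffreys = stats.gamma(a=x + 0.5, scale=1.0 / T)

    # Gamma approximation to the constrained noninformative prior with prior mean m:
    # shape 0.5, rate 0.5/m; conjugate updating simply adds (x, T).
    m = 1e-2
    a0, b0 = 0.5, 0.5 / m
    post_cnp = stats.gamma(a=x + a0, scale=1.0 / (T + b0))

    for name, dist in [("Jeffreys", post_jeffreys), ("constrained noninf.", post_cnp)]:
        print(f"{name:20s} mean = {dist.mean():.3e}   95th pct = {dist.ppf(0.95):.3e}")
    ```

    With no observed events, the posterior mean under the mean-constrained prior stays close to the prior mean, illustrating how a light-tailed informative prior can dominate sparse data.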

  4. 2016 TSRC Summer School on Fundamental Science for Alternative Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Batista, Victor S.

    The 2016 TSRC Summer School on Fundamental Science for Alternative Energy introduced principles, methods, and approaches relevant to the design of molecular transformations, energy transduction, and current applications for alternative energy. Energy and environment are likely to be key themes that will dominate the way science and engineering develop over the next few decades. Only an interdisciplinary approach with a team-taught structure as presented at the 2016 TSRC Summer School can be expected to succeed in the face of problems of such difficulty. The course inspired a new generation of 24 graduate students and 2 post-docs to continue work in the field, or at least to have something of an insider's point of view as the field develops in the next few decades.

  5. Fundamental approaches in molecular biology for communication sciences and disorders.

    PubMed

    Bartlett, Rebecca S; Jetté, Marie E; King, Suzanne N; Schaser, Allison; Thibeault, Susan L

    2012-08-01

    This contemporary tutorial will introduce general principles of molecular biology, common deoxyribonucleic acid (DNA), ribonucleic acid (RNA), and protein assays and their relevance in the field of communication sciences and disorders. Over the past 2 decades, knowledge of the molecular pathophysiology of human disease has increased at a remarkable pace. Most of this progress can be attributed to concomitant advances in basic molecular biology and, specifically, the development of an ever-expanding armamentarium of technologies for analysis of DNA, RNA, and protein structure and function. Details of these methodologies, their limitations, and examples from the communication sciences and disorders literature are presented. Results/Conclusions: The use of molecular biology techniques in the fields of speech, language, and hearing sciences is increasing, facilitating the need for an understanding of molecular biology fundamentals and common experimental assays.

  6. How evolutionary principles improve the understanding of human health and disease.

    PubMed

    Gluckman, Peter D; Low, Felicia M; Buklijas, Tatjana; Hanson, Mark A; Beedle, Alan S

    2011-03-01

    An appreciation of the fundamental principles of evolutionary biology provides new insights into major diseases and enables an integrated understanding of human biology and medicine. However, there is a lack of awareness of their importance amongst physicians, medical researchers, and educators, all of whom tend to focus on the mechanistic (proximate) basis for disease, excluding consideration of evolutionary (ultimate) reasons. The key principles of evolutionary medicine are that selection acts on fitness, not health or longevity; that our evolutionary history does not cause disease, but rather impacts on our risk of disease in particular environments; and that we are now living in novel environments compared to those in which we evolved. We consider these evolutionary principles in conjunction with population genetics and describe several pathways by which evolutionary processes can affect disease risk. These perspectives provide a more cohesive framework for gaining insights into the determinants of health and disease. Coupled with complementary insights offered by advances in genomic, epigenetic, and developmental biology research, evolutionary perspectives offer an important addition to understanding disease. Further, there are a number of aspects of evolutionary medicine that can add considerably to studies in other domains of contemporary evolutionary studies.

  7. How evolutionary principles improve the understanding of human health and disease

    PubMed Central

    Gluckman, Peter D; Low, Felicia M; Buklijas, Tatjana; Hanson, Mark A; Beedle, Alan S

    2011-01-01

    An appreciation of the fundamental principles of evolutionary biology provides new insights into major diseases and enables an integrated understanding of human biology and medicine. However, there is a lack of awareness of their importance amongst physicians, medical researchers, and educators, all of whom tend to focus on the mechanistic (proximate) basis for disease, excluding consideration of evolutionary (ultimate) reasons. The key principles of evolutionary medicine are that selection acts on fitness, not health or longevity; that our evolutionary history does not cause disease, but rather impacts on our risk of disease in particular environments; and that we are now living in novel environments compared to those in which we evolved. We consider these evolutionary principles in conjunction with population genetics and describe several pathways by which evolutionary processes can affect disease risk. These perspectives provide a more cohesive framework for gaining insights into the determinants of health and disease. Coupled with complementary insights offered by advances in genomic, epigenetic, and developmental biology research, evolutionary perspectives offer an important addition to understanding disease. Further, there are a number of aspects of evolutionary medicine that can add considerably to studies in other domains of contemporary evolutionary studies. PMID:25567971

  8. Survey of minimally invasive general surgery fellows training in robotic surgery.

    PubMed

    Shaligram, Abhijit; Meyer, Avishai; Simorov, Anton; Pallati, Pradeep; Oleynikov, Dmitry

    2013-06-01

    Minimally invasive surgery fellowships offer experience in robotic surgery, the nature of which is poorly defined. The objective of this survey was to determine the current status and opportunities for robotic surgery training available to fellows training in the United States and Canada. Sixty-five minimally invasive surgery fellows, attending a fundamentals of fellowship conference, were asked to complete a questionnaire regarding their demographics and experiences with robotic surgery and training. Fifty-one of the surveyed fellows completed the questionnaire (83 % response). Seventy-two percent of respondents had staff surgeons trained in performing robotic procedures, with 55 % of respondents having general surgery procedures performed robotically at their institution. Just over half (53 %) had access to a simulation facility for robotic training. Thirty-three percent offered mechanisms for certification and 11 % offered fellowships in robotic surgery. One-third of the minimally invasive surgery fellows felt they had been trained in robotic surgery and would consider making it part of their practice after fellowship. However, most (80 %) had no plans to pursue robotic surgery fellowships. Although a large group (63 %) felt optimistic about the future of robotic surgery, most respondents (72.5 %) felt their current experience with robotic surgery training was poor or below average. There is wide variation in exposure to and training in robotic surgery in minimally invasive surgery fellowship programs in the United States and Canada. Although a third of trainees felt adequately trained for performing robotic procedures, most fellows felt that their current experience with training was not adequate.

  9. Information Conservation is Fundamental: Recovering the Lost Information in Hawking Radiation

    NASA Astrophysics Data System (ADS)

    Zhang, Baocheng; Cai, Qing-Yu; Zhan, Ming-Sheng; You, Li

    2013-06-01

    In both the classical and the quantum world, information cannot appear or disappear. This fundamental principle, however, is questioned for a black hole by the acclaimed "information loss paradox." Based on the conservation laws of energy, charge, and angular momentum, we recently showed that the total information encoded in the correlations among Hawking radiations equals exactly the amount previously considered lost, assuming the nonthermal spectrum of Parikh and Wilczek. Thus the information loss paradox can be falsified through experiments by detecting correlations, for instance by measuring the covariances of Hawking radiations from black holes, such as the man-made ones speculated to appear in LHC experiments. The affirmation of information conservation in Hawking radiation will shine new light on the unification of gravity with quantum mechanics.
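
    For reference, the nonthermal tunnelling spectrum invoked above, and the correlation it induces between two emissions, can be written (in units G = c = ħ = k_B = 1; these are standard forms in the Parikh-Wilczek literature rather than quotations from this paper):

    ```latex
    \Gamma(\omega) \sim e^{\Delta S_{\mathrm{BH}}}
      = \exp\!\left[-8\pi\omega\left(M - \tfrac{\omega}{2}\right)\right], \qquad
    \ln\Gamma(\omega_1 + \omega_2) - \ln\!\left[\Gamma(\omega_1)\,\Gamma(\omega_2)\right]
      = 8\pi\,\omega_1\omega_2 \neq 0,
    ```

    the nonvanishing term 8πω₁ω₂ being the correlation between successive Hawking quanta that is argued to carry the information otherwise considered lost.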

  10. First-principles investigations of iron-based alloys and their properties

    NASA Astrophysics Data System (ADS)

    Limmer, Krista Renee

    Fundamental understanding of the complex interactions governing structure-property relationships in iron-based alloys is necessary to advance ferrous metallurgy. Two key components of alloy design are carbide formation and stabilization and controlling the active deformation mechanism. Following a first-principles methodology, understanding on the electronic level of these components has been gained for predictive modeling of alloys. Transition metal carbides have long played an important role in alloy design, though the complexity of their interactions with the ferrous matrix is not well understood. Bulk, surface, and interface properties of vanadium carbide, VCx, were calculated to provide insight for the carbide formation and stability. Carbon vacancy defects are shown to stabilize the bulk carbide due to increased V-V bonding in addition to localized increased V-C bond strength. The VCx (100) surface energy is minimized when carbon vacancies are at least two layers from the surface. Further, the Fe/VC interface is stabilized through maintaining stoichiometry at the Fe/VC interface. Intrinsic and unstable stacking fault energies, γ_isf and γ_usf respectively, were explicitly calculated in nonmagnetic fcc Fe-X systems for X = Al, Si, P, S, and the 3d and 4d transition elements. A parabolic relationship is observed in γ_isf across the transition metals with minimums observed for Mn and Tc in the 3d and 4d periods, respectively. Mn is the only alloying addition that was shown to decrease γ_isf in fcc Fe at the given concentration. The effect of alloying on γ_usf also has a parabolic relationship, with all additions decreasing γ_isf yielding maximums for Fe and Rh.

  11. [Sampling in qualitative research: basic principles and some controversies].

    PubMed

    Martínez-Salgado, Carolina

    2012-03-01

    This paper presents the rationale for the choice of participants in qualitative research in contrast with that of probability sampling principles in epidemiological research. For a better understanding of the differences, concepts of nomothetic and idiographic generalizability, as well as those of transferability and reflexivity, are proposed. The fundamentals of the main types of sampling commonly used in qualitative research and the meaning of the concept of saturation are discussed. Finally, some reflections on the controversies that have arisen in recent years on various paradigmatic perspectives from which to conduct qualitative research, their possibilities of combination with epidemiological research, and some implications for the study of health issues are presented.

  12. Extrema principles of entropy production and energy dissipation in fluid mechanics

    NASA Technical Reports Server (NTRS)

    Horne, W. Clifton; Karamcheti, Krishnamurty

    1988-01-01

    A survey is presented of several extrema principles of energy dissipation as applied to problems in fluid mechanics. An exact equation is derived for the dissipation function of a homogeneous, isotropic, Newtonian fluid, with terms associated with irreversible compression or expansion, wave radiation, and the square of the vorticity. By using entropy extrema principles, simple flows such as the incompressible channel flow and the cylindrical vortex are identified as minimal dissipative distributions. The principal notions of stability of parallel shear flows appear to be associated with a maximum dissipation condition. These different conditions are consistent with Prigogine's classification of thermodynamic states into categories of equilibrium, linear nonequilibrium, and nonlinear nonequilibrium thermodynamics; vortices and acoustic waves appear as examples of dissipative structures. The measurements of a typical periodic shear flow, the rectangular wall jet, show that direct measurements of the dissipative terms are possible.
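
    For background, the textbook dissipation function for a homogeneous, isotropic, Newtonian fluid, which the survey rearranges into compression, wave-radiation, and vorticity-squared contributions, has the standard form (given for orientation; this is not the paper's exact decomposition):

    ```latex
    \Phi = 2\mu\, e_{ij} e_{ij} + \lambda\,(\nabla\cdot\mathbf{u})^{2}, \qquad
    e_{ij} = \tfrac{1}{2}\left(\partial_i u_j + \partial_j u_i\right),
    ```

    with μ the shear viscosity and λ the second viscosity coefficient.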

  13. Microwave remote sensing: Active and passive. Volume 1 - Microwave remote sensing fundamentals and radiometry

    NASA Technical Reports Server (NTRS)

    Ulaby, F. T.; Moore, R. K.; Fung, A. K.

    1981-01-01

    The three components of microwave remote sensing (sensor-scene interaction, sensor design, and measurement techniques), and the applications to geoscience are examined. The history of active and passive microwave sensing is reviewed, along with fundamental principles of electromagnetic wave propagation, antennas, and microwave interaction with atmospheric constituents. Radiometric concepts are reviewed, particularly for measurement problems for atmospheric and terrestrial sources of natural radiation. Particular attention is given to the emission by atmospheric gases, clouds, and rain as described by the radiative transfer function. Finally, the operation and performance characteristics of radiometer receivers are discussed, particularly for measurement precision, calibration techniques, and imaging considerations.
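
    As background for the measurement-precision discussion, the sensitivity of an ideal total-power radiometer is given by the standard radiometer equation (a textbook relation, not a formula quoted from the volume):

    ```latex
    \Delta T_{\mathrm{rms}} = \frac{T_{\mathrm{sys}}}{\sqrt{B\,\tau}},
    ```

    so that, for example, a system noise temperature of 500 K observed with a bandwidth of 100 MHz and an integration time of 10 ms resolves brightness-temperature changes of about 0.5 K.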

  14. Fundamentals of metasurface lasers based on resonant dark states

    DOE PAGES

    Droulias, Sotiris; Jain, Aditya; Koschny, Thomas; ...

    2017-10-30

    Recently, our group proposed a metamaterial laser design based on explicitly coupled dark resonant states in low-loss dielectrics, which conceptually separates the gain-coupled resonant photonic state responsible for macroscopic stimulated emission from the coupling to specific free-space propagating modes, allowing independent adjustment of the lasing state and its coherent radiation output. Due to this functionality, it is now possible to make lasers that can overcome the trade-off between system dimensions and Q factor, especially for surface emitting lasers with deeply subwavelength thickness. In this paper, we give a detailed discussion of the key functionality and benefits of this design, such as radiation damping tunability, directionality, subwavelength integration, and simple layer-by-layer fabrication. Finally, we examine in detail the fundamental design tradeoffs that establish the principle of operation and must be taken into account and give guidance for realistic implementations.

  15. 41 CFR 102-76.55 - What sustainable development principles must Federal agencies apply to the siting, design, and...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... construction of new facilities, which include— (a) Optimizing site potential; (b) Minimizing non-renewable... development principles must Federal agencies apply to the siting, design, and construction of new facilities... Federal agencies apply to the siting, design, and construction of new facilities? In keeping with the...

  16. 41 CFR 102-76.55 - What sustainable development principles must Federal agencies apply to the siting, design, and...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... construction of new facilities, which include— (a) Optimizing site potential; (b) Minimizing non-renewable... development principles must Federal agencies apply to the siting, design, and construction of new facilities... Federal agencies apply to the siting, design, and construction of new facilities? In keeping with the...

  17. 41 CFR 102-76.55 - What sustainable development principles must Federal agencies apply to the siting, design, and...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... construction of new facilities, which include— (a) Optimizing site potential; (b) Minimizing non-renewable... development principles must Federal agencies apply to the siting, design, and construction of new facilities... Federal agencies apply to the siting, design, and construction of new facilities? In keeping with the...

  18. 41 CFR 102-76.55 - What sustainable development principles must Federal agencies apply to the siting, design, and...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... construction of new facilities, which include— (a) Optimizing site potential; (b) Minimizing non-renewable... development principles must Federal agencies apply to the siting, design, and construction of new facilities... Federal agencies apply to the siting, design, and construction of new facilities? In keeping with the...

  19. Equivalence of the Kelvin-Planck statement of the second law and the principle of entropy increase

    NASA Astrophysics Data System (ADS)

    Sarasua, L. G.; Abal, G.

    2016-09-01

    We present a demonstration of the equivalence between the Kelvin-Planck statement of the second law and the principle of entropy increase. Despite the fundamental importance of these two statements, a rigorous treatment to establish their equivalence is missing in standard physics textbooks. The argument is valid under very general conditions, but is simple and suited to an undergraduate course.
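
    For context, the standard textbook route from the Kelvin-Planck statement to the entropy principle, which may differ in detail from the argument of the paper, runs through the Clausius inequality applied to an irreversible process A→B closed by a reversible return path:

    ```latex
    \oint \frac{\delta Q}{T} \le 0
    \;\Longrightarrow\;
    0 \;\ge\; \int_{A}^{B} \frac{\delta Q_{\mathrm{irr}}}{T} + \int_{B}^{A} \frac{\delta Q_{\mathrm{rev}}}{T}
      = \int_{A}^{B} \frac{\delta Q_{\mathrm{irr}}}{T} - \bigl[S(B) - S(A)\bigr],
    ```

    so that for a thermally isolated system, where δQ_irr = 0, one obtains S(B) ≥ S(A), i.e. the principle of entropy increase.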

  20. Efficient Variational Quantum Simulator Incorporating Active Error Minimization

    NASA Astrophysics Data System (ADS)

    Li, Ying; Benjamin, Simon C.

    2017-04-01

    One of the key applications for quantum computers will be the simulation of other quantum systems that arise in chemistry, materials science, etc., in order to accelerate the process of discovery. It is important to ask the following question: Can this simulation be achieved using near-future quantum processors, of modest size and under imperfect control, or must it await the more distant era of large-scale fault-tolerant quantum computing? Here, we propose a variational method involving closely integrated classical and quantum coprocessors. We presume that all operations in the quantum coprocessor are prone to error. The impact of such errors is minimized by boosting them artificially and then extrapolating to the zero-error case. In comparison to a more conventional optimized Trotterization technique, we find that our protocol is efficient and appears to be fundamentally more robust against error accumulation.
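
    The error-boosting-and-extrapolation step described above is a form of zero-noise extrapolation. The sketch below is a minimal, self-contained illustration of that idea with a synthetic noisy backend; the function names and data are invented here and are not the authors' implementation.

      # Minimal sketch of zero-noise extrapolation (illustrative only; not the authors' code).
      # Assumption: expectation_at(noise_scale) runs the noisy quantum circuit with all error
      # rates artificially boosted by `noise_scale` and returns the measured expectation value.
      import numpy as np

      def zero_noise_extrapolate(expectation_at, scales=(1.0, 2.0, 3.0), order=1):
          """Fit a low-order polynomial in the noise scale and evaluate it at scale = 0."""
          scales = np.asarray(scales, dtype=float)
          values = np.array([expectation_at(s) for s in scales])
          coeffs = np.polyfit(scales, values, deg=order)   # least-squares polynomial fit
          return np.polyval(coeffs, 0.0)                   # extrapolated zero-error estimate

      # Toy usage: a fake observable whose bias grows linearly with the boosted error rate.
      true_value = 0.80
      fake_backend = lambda s: true_value - 0.05 * s + np.random.normal(0.0, 1e-3)
      print(zero_noise_extrapolate(fake_backend))          # prints a value close to 0.80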

  1. Gaming the System: Developing an Educational Game for Securing Principles of Arterial Blood Gases.

    PubMed

    Boyd, Cory Ann; Warren, Jonah; Glendon, Mary Ann

    2016-01-01

    This article describes the development process for creating a digital educational mini game prototype designed to provide practice opportunities for learning fundamental principles of arterial blood gases. Mini games generally take less than an hour to play and focus on specific subject matter. An interdisciplinary team of faculty from two universities mentored student game developers to design a digital educational mini game prototype. Sixteen accelerated bachelor of science in nursing students collaborated with game development students and playtested the game prototype during the last semester of their senior year in nursing school. Playtesting is a form of feedback that supports an iterative design process that is critical to game development. A 10-question survey was coupled with group discussions addressing five broad themes of an archetypical digital educational mini game to yield feedback on game design, play, and content. Four rounds of playtesting and incorporating feedback supported the iterative process. Accelerated bachelor of science in nursing student playtester feedback suggests that the digital educational mini game prototype has potential for offering an engaging, playful game experience that will support securing the fundamental principles of arterial blood gases. Next steps are to test the digital educational mini game for teaching and learning effectiveness. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. CODATA Fundamental Physical Constants

    National Institute of Standards and Technology Data Gateway

    SRD 121 NIST CODATA Fundamental Physical Constants (Web, free access)   This site, developed in the Physics Laboratory at NIST, addresses three topics: fundamental physical constants, the International System of Units (SI), which is the modern metric system, and expressing the uncertainty of measurement results.

  3. Modification of Schrödinger-Newton equation due to braneworld models with minimal length

    NASA Astrophysics Data System (ADS)

    Bhat, Anha; Dey, Sanjib; Faizal, Mir; Hou, Chenguang; Zhao, Qin

    2017-07-01

    We study the correction of the energy spectrum of a gravitational quantum well due to the combined effect of the braneworld model with infinite extra dimensions and the generalized uncertainty principle. The correction terms arise from a natural deformation of a semiclassical theory of quantum gravity governed by the Schrödinger-Newton equation based on a minimal length framework. The twofold correction in the energy yields new values of the spectrum, which are closer to the values obtained in the GRANIT experiment. This raises the possibility that the combined theory of semiclassical quantum gravity and the generalized uncertainty principle may provide an intermediate theory between the semiclassical and the full theory of quantum gravity. We also prepare a schematic experimental set-up which may guide the understanding of these phenomena in the laboratory.
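
    For readers unfamiliar with the generalized uncertainty principle invoked here, a commonly used minimal-length deformation is shown below; this is generic background, and the specific form adopted in the paper may differ.

      % Commonly quoted minimal-length GUP (generic background; the paper's deformation may differ):
      \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^{2}\right),
      \qquad \Delta x_{\min} = \hbar\sqrt{\beta}.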

  4. Esophagectomy - minimally invasive

    MedlinePlus

    Minimally invasive esophagectomy; Robotic esophagectomy; Removal of the esophagus - minimally invasive; Achalasia - esophagectomy; Barrett esophagus - esophagectomy; Esophageal cancer - esophagectomy - laparoscopic; Cancer of the ...

  5. Myosin-II controls cellular branching morphogenesis and migration in 3D by minimizing cell surface curvature

    PubMed Central

    Elliott, Hunter; Fischer, Robert A.; Myers, Kenneth A.; Desai, Ravi A.; Gao, Lin; Chen, Christopher S.; Adelstein, Robert; Waterman, Clare M.; Danuser, Gaudenz

    2014-01-01

    In many cases cell function is intimately linked to cell shape control. We utilized endothelial cell branching morphogenesis as a model to understand the role of myosin-II in shape control of invasive cells migrating in 3D collagen gels. We applied principles of differential geometry and mathematical morphology to 3D image sets to parameterize cell branch structure and local cell surface curvature. We find that Rho/ROCK-stimulated myosin-II contractility minimizes cell-scale branching by recognizing and minimizing local cell surface curvature. Utilizing micro-fabrication to constrain cell shape identifies a positive feedback mechanism in which low curvature stabilizes myosin-II cortical association, where it acts to maintain minimal curvature. The feedback between myosin-II regulation by and control of curvature drives cycles of localized cortical myosin-II assembly and disassembly. These cycles in turn mediate alternating phases of directionally biased branch initiation and retraction to guide 3D cell migration. PMID:25621949
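
    As general differential-geometry background (not the paper's specific image-analysis pipeline), the local surface curvature referred to here is commonly summarized by the mean curvature:

      % Standard definition; \kappa_1 and \kappa_2 are the principal curvatures at a surface point.
      H = \frac{\kappa_1 + \kappa_2}{2}.
      % The abstract describes myosin-II as recognizing and minimizing regions of high local curvature.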

  6. International consensus principles for ethical wildlife control.

    PubMed

    Dubois, Sara; Fenwick, Nicole; Ryan, Erin A; Baker, Liv; Baker, Sandra E; Beausoleil, Ngaio J; Carter, Scott; Cartwright, Barbara; Costa, Federico; Draper, Chris; Griffin, John; Grogan, Adam; Howald, Gregg; Jones, Bidda; Littin, Kate E; Lombard, Amanda T; Mellor, David J; Ramp, Daniel; Schuppli, Catherine A; Fraser, David

    2017-08-01

    Human-wildlife conflicts are commonly addressed by excluding, relocating, or lethally controlling animals with the goal of preserving public health and safety, protecting property, or conserving other valued wildlife. However, declining wildlife populations, a lack of efficacy of control methods in achieving desired outcomes, and changes in how people value animals have triggered widespread acknowledgment of the need for ethical and evidence-based approaches to managing such conflicts. We explored international perspectives on and experiences with human-wildlife conflicts to develop principles for ethical wildlife control. A diverse panel of 20 experts convened at a 2-day workshop and developed the principles through a facilitated engagement process and discussion. They determined that efforts to control wildlife should begin wherever possible by altering the human practices that cause human-wildlife conflict and by developing a culture of coexistence; be justified by evidence that significant harms are being caused to people, property, livelihoods, ecosystems, and/or other animals; have measurable outcome-based objectives that are clear, achievable, monitored, and adaptive; predictably minimize animal welfare harms to the fewest number of animals; be informed by community values as well as scientific, technical, and practical information; be integrated into plans for systematic long-term management; and be based on the specifics of the situation rather than negative labels (pest, overabundant) applied to the target species. We recommend that these principles guide development of international, national, and local standards and control decisions and implementation. © 2017 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.

  7. Outcome of a graduated minimally invasive facial reanimation in patients with facial paralysis.

    PubMed

    Holtmann, Laura C; Eckstein, Anja; Stähr, Kerstin; Xing, Minzhi; Lang, Stephan; Mattheis, Stefan

    2017-08-01

    Peripheral paralysis of the facial nerve is the most frequent of all cranial nerve disorders. Despite advances in facial surgery, the functional and aesthetic reconstruction of a paralyzed face remains a challenge. Graduated minimally invasive facial reanimation is based on a modular principle. According to the patients' needs, precondition, and expectations, the following modules can be performed: temporalis muscle transposition and facelift, nasal valve suspension, endoscopic brow lift, and eyelid reconstruction. Applying a concept of a graduated minimally invasive facial reanimation may help minimize surgical trauma and reduce morbidity. Twenty patients underwent a graduated minimally invasive facial reanimation. A retrospective chart review was performed with a follow-up examination between 1 and 8 months after surgery. The FACEgram software was used to calculate pre- and postoperative eyelid closure, the level of brows, nasal, and philtral symmetry as well as oral commissure position at rest and oral commissure excursion with smile. As a patient-oriented outcome parameter, the Glasgow Benefit Inventory questionnaire was applied. There was a statistically significant improvement in the postoperative score of eyelid closure, brow asymmetry, nasal asymmetry, philtral asymmetry as well as oral commissure symmetry at rest (p < 0.05). Smile evaluation revealed no significant change of oral commissure excursion. The mean Glasgow Benefit Inventory score indicated substantial improvement in patients' overall quality of life. If a primary facial nerve repair or microneurovascular tissue transfer cannot be applied, graduated minimally invasive facial reanimation is a promising option to restore facial function and symmetry at rest.

  8. Infusing fundamental cause theory with features of Pierre Bourdieu's theory of symbolic power.

    PubMed

    Veenstra, Gerry

    2018-02-01

    The theory of fundamental causes is one of the more influential attempts to provide a theoretical infrastructure for the strong associations between indicators of socioeconomic status (education, income, occupation) and health. It maintains that people of higher socioeconomic status have greater access to flexible resources such as money, knowledge, prestige, power, and beneficial social connections that they can use to reduce their risks of morbidity and mortality and minimize the consequences of disease once it occurs. However, several key aspects of the theory remain underspecified, compromising its ability to provide truly compelling explanations for socioeconomic health inequalities. In particular, socioeconomic status is an assembly of indicators that do not necessarily cohere in a straightforward way, the flexible resources that disproportionately accrue to higher status people are not clearly defined, and the distinction between socioeconomic status and resources is ambiguous. I attempt to address these definitional issues by infusing fundamental cause theory with features of a well-known theory of socioeconomic stratification in the sociological literature-Pierre Bourdieu's theory of symbolic power.

  9. An Independent Asteroseismic Analysis of the Fundamental Parameters and Internal Structure of the Solar-like Oscillator KIC 6225718

    NASA Astrophysics Data System (ADS)

    Wu, Tao; Li, Yan

    2017-09-01

    Asteroseismology is a useful tool that is usually used to probe stellar interiors and to determine stellar fundamental parameters, such as stellar mass, radius, and surface gravity. In order to probe stellar interiors, comparisons between observations and models are usually made with the χ²-minimization method. The work of Wu & Li reported that the best parameter determined by the χ²-matching process is the acoustic radius for pure p-mode oscillations. In the present work, based on the theoretical calculations of Wu & Li, we independently analyze the seismic observations of KIC 6225718 to determine its fundamental parameters and to investigate its interior properties. First, in order to test the method, we apply it to the Sun to determine its fundamental parameters and to investigate its interior. Second, we independently determine the fundamental parameters of KIC 6225718 without any other non-seismic constraint. Therefore, those determined fundamental parameters are independent of those determined by other methods and can be regarded as independent references in other analyses. Finally, we analyze the stellar internal structure and find that KIC 6225718 has a convective core with a size of 0.078-0.092 R_⊙. Its overshooting parameter f_ov in the core is around 0.010. In addition, its center hydrogen abundance X_c is about 0.264-0.355.
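
    To picture the χ²-matching step, the toy sketch below selects the stellar model whose predicted oscillation frequencies best match the observed ones; all frequencies, uncertainties, and model labels are invented for illustration and are not from the paper.

      # Toy chi-square matching between observed and model oscillation frequencies (invented numbers).
      import numpy as np

      def chi_square(nu_obs, sigma_obs, nu_model):
          return np.sum(((nu_obs - nu_model) / sigma_obs) ** 2) / len(nu_obs)

      nu_obs = np.array([1200.0, 1305.0, 1410.0])         # observed frequencies (e.g. microHz)
      sigma_obs = np.array([1.0, 1.2, 0.9])               # observational uncertainties

      model_grid = {                                      # hypothetical grid of stellar models
          "M = 1.10 Msun": np.array([1198.0, 1304.0, 1412.0]),
          "M = 1.15 Msun": np.array([1201.0, 1306.0, 1409.5]),
          "M = 1.20 Msun": np.array([1210.0, 1315.0, 1420.0]),
      }

      best = min(model_grid, key=lambda k: chi_square(nu_obs, sigma_obs, model_grid[k]))
      print(best)                                         # the model minimizing chi-square wins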

  10. Equivocating on the polluter-pays principle: The consequences for Pakistan.

    PubMed

    Luken, Ralph A

    2009-08-01

    The polluter-pays principle has been widely implemented in OECD countries and credited with bringing about a significant reduction in pollutant discharge. However, it has had only limited implementation in developing countries. The consequences of not implementing it in developing countries, to the extent they are documented, are limited to estimates of the economic damages of environmental degradation. Yet there are several other, but seldom documented, negative consequences of the failure to implement the polluter-pays principle. These consequences are documented in the case of Pakistan. They include limited construction of effluent treatment plants, heavy dependence on the government and international donors for funding the only two operational common effluent treatment plants, significant operational issues at the two common effluent treatment plants, missed opportunities to build cost-effective common effluent treatment plants, and minimal environmental improvements from isolated investments in individual effluent treatment plants, in addition to the already documented significant level of environmental degradation due to uncontrolled pollutant discharge.

  11. Defining the fundamentals of care.

    PubMed

    Kitson, Alison; Conroy, Tiffany; Wengstrom, Yvonne; Profetto-McGrath, Joanne; Robertson-Malt, Suzi

    2010-08-01

    A three-stage process is being undertaken to investigate the fundamentals of care. Stage One (reported here) involves the use of a meta-narrative review methodology to undertake a thematic analysis, categorization and synthesis of selected contents extracted from seminal texts relating to nursing practice. Stage Two will involve a search for evidence to inform the fundamentals of care and a refinement of the review method. Stage Three will extend the reviews of the elements defined as fundamentals of care. This introductory paper covers the following aspects: the conceptual basis upon which nursing care is delivered; how the fundamentals of care have been defined in the literature and in practice; an argument that physiological aspects of care, self-care elements and aspects of the environment of care are central to the conceptual refinement of the term fundamentals of care; and that efforts to systematize such information will enhance overall care delivery through improvements in patient safety and quality initiatives in health systems.

  12. Postpartum sexual health: a principle-based concept analysis.

    PubMed

    O'Malley, Deirdre; Higgins, Agnes; Smith, Valerie

    2015-10-01

    The aim of this study is to report an analysis of the concept of postpartum sexual health. Postpartum sexual health is a minimally understood concept, most often framed within physical/biological dimensions or as a 'checklist' task in postpartum information provision. This has the potential to leave women unprepared to manage transient or normative sexual health changes after childbirth. For meaningful discussions, clarity and understanding of postpartum sexual health is required. A principle-based method of concept analysis. The databases of PubMed, CINAHL, Maternity and Infant Care, PsychInfo, Web of Science, EMBASE, SCOPUS and Social Science Index were systematically searched, from their earliest dates, using a combination of key terms, including; 'sexual health', 'sexual function', 'dyspareunia', 'sexuality', 'sexual desire', 'sexual dysfunction', 'postnatal' and 'postpartum', resulting in a final included dataset of 91 studies. Using the principle-based approach, postpartum sexual health was analysed under the four philosophical principles of epistemological, pragmatic, linguistic and logical. Philosophically, postpartum sexual health is underdeveloped as a concept. A precise theoretical definition remains elusive and, presently, postpartum sexual health cannot be separated theoretically from sexuality and sexual function. Identified antecedents include an instrument free birth, an intact perineum and avoidance of episiotomy. Attributes include sexual arousal, desire, orgasm, sexual satisfaction and resumption of sexual intercourse. Outcomes are sexual satisfaction and a satisfying intimate relationship with one's partner. Postpartum sexual health is conceptually immature with limited applicability in current midwifery practice. © 2015 John Wiley & Sons Ltd.

  13. Inference with minimal Gibbs free energy in information field theory.

    PubMed

    Ensslin, Torsten A; Weig, Cornelius

    2010-11-01

    Non-linear and non-gaussian signal inference problems are difficult to tackle. Renormalization techniques permit us to construct good estimators for the posterior signal mean within information field theory (IFT), but the approximations and assumptions made are not very obvious. Here we introduce the simple concept of minimal Gibbs free energy to IFT, and show that previous renormalization results emerge naturally. They can be understood as being the gaussian approximation to the full posterior probability, which has maximal cross information with it. We derive optimized estimators for three applications, to illustrate the usage of the framework: (i) reconstruction of a log-normal signal from poissonian data with background counts and point spread function, as it is needed for gamma ray astronomy and for cosmography using photometric galaxy redshifts, (ii) inference of a gaussian signal with unknown spectrum, and (iii) inference of a poissonian log-normal signal with unknown spectrum, the combination of (i) and (ii). Finally we explain how gaussian knowledge states constructed by the minimal Gibbs free energy principle at different temperatures can be combined into a more accurate surrogate of the non-gaussian posterior.
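
    Schematically, and in generic notation that may differ from the paper's, the construction minimizes a Gibbs free energy over Gaussian approximations to the posterior:

      % Generic sketch (our notation): approximate the posterior for signal s given data d by a
      % Gaussian with mean m and covariance D, and minimize
      \mathcal{G}(m, D)
        = \underbrace{\bigl\langle H(d, s) \bigr\rangle_{\mathcal{G}(s - m,\, D)}}_{\text{internal energy } U}
        \;-\; T \, \underbrace{S\bigl[\mathcal{G}(s - m,\, D)\bigr]}_{\text{entropy}},
      % where H(d, s) = -ln P(d, s) is the information Hamiltonian and T is a temperature parameter.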

  14. The role of biogeochemical hotspots, landscape heterogeneity, and hydrological connectivity for minimizing forestry effects on water quality.

    PubMed

    Laudon, Hjalmar; Kuglerová, Lenka; Sponseller, Ryan A; Futter, Martyn; Nordin, Annika; Bishop, Kevin; Lundmark, Tomas; Egnell, Gustaf; Ågren, Anneli M

    2016-02-01

    Protecting water quality in forested regions is increasingly important as pressures from land-use, long-range transport of air pollutants, and climate change intensify. Maintaining forest industry without jeopardizing sustainability of surface water quality therefore requires new tools and approaches. Here, we show how forest management can be optimized by incorporating landscape sensitivity and hydrological connectivity into a framework that promotes the protection of water quality. We discuss how this approach can be operationalized into a hydromapping tool to support forestry operations that minimize water quality impacts. We specifically focus on how hydromapping can be used to support three fundamental aspects of land management planning including how to (i) locate areas where different forestry practices can be conducted with minimal water quality impact; (ii) guide the off-road driving of forestry machines to minimize soil damage; and (iii) optimize the design of riparian buffer zones. While this work has a boreal perspective, these concepts and approaches have broad-scale applicability.

  15. Correlates of minimal dating.

    PubMed

    Leck, Kira

    2006-10-01

    Researchers have associated minimal dating with numerous factors. The present author tested shyness, introversion, physical attractiveness, performance evaluation, anxiety, social skill, social self-esteem, and loneliness to determine the nature of their relationships with 2 measures of self-reported minimal dating in a sample of 175 college students. For women, shyness, introversion, physical attractiveness, self-rated anxiety, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. For men, physical attractiveness, observer-rated social skill, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. The patterns of relationships were not identical for the 2 indicators of minimal dating, indicating the possibility that minimal dating is not a single construct as researchers previously believed. The present author discussed implications and suggestions for future researchers.

  16. The minimal GUT with inflaton and dark matter unification

    NASA Astrophysics Data System (ADS)

    Chen, Heng-Yu; Gogoladze, Ilia; Hu, Shan; Li, Tianjun; Wu, Lina

    2018-01-01

    Giving up the solutions to the fine-tuning problems, we propose the non-supersymmetric flipped SU(5) × U(1)_X model based on the minimal particle content principle, which can be constructed from the four-dimensional SO(10) models, five-dimensional orbifold SO(10) models, and local F-theory SO(10) models. To achieve gauge coupling unification, we introduce one pair of vector-like fermions, which form a complete SU(5) × U(1)_X representation. The proton lifetime is around 5 × 10^35 years, neutrino masses and mixing can be explained via the seesaw mechanism, baryon asymmetry can be generated via leptogenesis, and the vacuum stability problem can be solved as well. In particular, we propose that the inflaton and dark matter particles can be unified into a real scalar field with Z_2 symmetry, which is not an axion and does not have a non-minimal coupling to gravity. Such scenarios can be applied to generic scalar dark matter models. Also, we find that the vector-like particle corrections to the B_s^0 masses might be about 6.6%, while their corrections to the K^0 and B_d^0 masses are negligible.

  17. [The beginning of the first principles: the anthropic principle].

    PubMed

    González de Posada, Francisco

    2004-01-01

    The now-classical Anthropic Principle is placed both in the historical perspective of the traditional problem of "the place of man in the Universe", and at the confluence of several scientific "border" issues, some of which, due to their problematical nature, are also subjects of philosophical analysis. On the one hand, the scientific uses of the Principle, related to the initial and constitutional conditions of "our Universe", are enumerated, as they are supposedly necessary for the appearance and consequent development of Life--up to Man--. On the other, an organized collection of the principles of today's Physics is synthetically exhibited. The object of this work is to determine the intrinsic scientific nature of the Anthropic Principle, and the role it plays in the global frame of the principles of Physics (Astrophysics, Astrobiology and Cosmology).

  18. Context Effects in Western Herbal Medicine: Fundamental to Effectiveness?

    PubMed

    Snow, James

    2016-01-01

    Western herbal medicine (WHM) is a complex healthcare system that uses traditional plant-based medicines in patient care. Typical preparations are individualized polyherbal formulae that, unlike herbal pills, retain the odor and taste of whole herbs. Qualitative studies in WHM show patient-practitioner relationships to be collaborative. Health narratives are co-constructed, leading to assessments, and treatments with personal significance for participants. It is hypothesized that the distinct characteristics of traditional herbal preparations and patient-herbalist interactions, in conjunction with the WHM physical healthcare environment, evoke context (placebo) effects that are fundamental to the overall effectiveness of herbal treatment. These context effects may need to be minimized to demonstrate pharmacological efficacy of herbal formulae in randomized, placebo-controlled trials, optimized to demonstrate effectiveness of WHM in pragmatic trials, and consciously harnessed to enhance outcomes in clinical practice. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Phase diagram of two-dimensional hard rods from fundamental mixed measure density functional theory

    NASA Astrophysics Data System (ADS)

    Wittmann, René; Sitta, Christoph E.; Smallenburg, Frank; Löwen, Hartmut

    2017-10-01

    A density functional theory for the bulk phase diagram of two-dimensional orientable hard rods is proposed and tested against Monte Carlo computer simulation data. In detail, an explicit density functional is derived from fundamental mixed measure theory and freely minimized numerically for hard discorectangles. The phase diagram, which involves stable isotropic, nematic, smectic, and crystalline phases, is obtained and shows good agreement with the simulation data. Our functional is valid for a multicomponent mixture of hard particles with arbitrary convex shapes and provides a reliable starting point to explore various inhomogeneous situations of two-dimensional hard rods and their Brownian dynamics.

  20. Minimizing radiation exposure during percutaneous nephrolithotomy.

    PubMed

    Chen, T T; Preminger, G M; Lipkin, M E

    2015-12-01

    Given the recent trends in growing per capita radiation dose from medical sources, there have been increasing concerns over patient radiation exposure. Patients with kidney stones undergoing percutaneous nephrolithotomy (PNL) are at particular risk for high radiation exposure. There exist several risk factors for increased radiation exposure during PNL which include high Body Mass Index, multiple access tracts, and increased stone burden. We herein review recent trends in radiation exposure, radiation exposure during PNL to both patients and urologists, and various approaches to reduce radiation exposure. We discuss incorporating the principles of As Low As reasonably Achievable (ALARA) into clinical practice and review imaging techniques such as ultrasound and air contrast to guide PNL access. Alternative surgical techniques and approaches to reducing radiation exposure, including retrograde intra-renal surgery, retrograde nephrostomy, endoscopic-guided PNL, and minimally invasive PNL, are also highlighted. It is important for urologists to be aware of these concepts and techniques when treating stone patients with PNL. The discussions outlined will assist urologists in providing patient counseling and high quality of care.

  1. Fundamental Structure of Loop Quantum Gravity

    NASA Astrophysics Data System (ADS)

    Han, Muxin; Ma, Yongge; Huang, Weiming

    In the past twenty years, loop quantum gravity, a background independent approach to unify general relativity and quantum mechanics, has been widely investigated. The aim of loop quantum gravity is to construct a mathematically rigorous, background independent, non-perturbative quantum theory for a Lorentzian gravitational field on a four-dimensional manifold. In the approach, the principles of quantum mechanics are combined with those of general relativity naturally. Such a combination provides us with a picture of so-called quantum Riemannian geometry, which is discrete on the fundamental scale. Imposing the quantum constraints in analogy with the classical ones, the quantum dynamics of gravity is being studied as one of the most important issues in loop quantum gravity. On the other hand, the semi-classical analysis is being carried out to test the classical limit of the quantum theory. In this review, the fundamental structure of loop quantum gravity is presented pedagogically. Our main aim is to help non-experts to understand the motivations, basic structures, as well as general results. It may also be beneficial to practitioners to gain insights from different perspectives on the theory. We will focus on the theoretical framework itself, rather than its applications, and do our best to write it in modern and precise language while keeping the presentation accessible for beginners. After reviewing the classical connection dynamical formalism of general relativity, as a foundation, the construction of the kinematical Ashtekar-Isham-Lewandowski representation is introduced in the context of quantum kinematics. The algebraic structure of quantum kinematics is also discussed. In the context of quantum dynamics, we mainly introduce the construction of a Hamiltonian constraint operator and the master constraint project. Finally, some applications and recent advances are outlined. It should be noted that this strategy of quantizing gravity can also be extended to

  2. Living systems do not minimize free energy. Comment on "Answering Schrödinger's question: A free-energy formulation" by Maxwell James Dèsormeau Ramstead et al.

    NASA Astrophysics Data System (ADS)

    Martyushev, Leonid M.

    2018-03-01

    The paper [1] is certainly very useful and important for understanding living systems (e.g. brain) as adaptive, self-organizing patterns. There is no need to enumerate all advantages of the paper, they are obvious. The purpose of my brief comment is to discuss one issue which, as I see it, was not thought out by the authors well enough. As a consequence, their ideas do not find as wide distribution as they otherwise could have found. This issue is related to the name selected for the principle forming the basis of their approach: free-energy principle (FEP). According to the sec. 2.1 [1]: "It asserts that all biological systems maintain their integrity by actively reducing the disorder or dispersion (i.e., entropy) of their sensory and physiological states by minimizing their variational free energy." Let us note that the authors suggested different names for the principle in their earlier works (an objective function, a function of the ensemble density encoded by the organism's configuration and the sensory data to which it is exposed, etc.), and explicitly and correctly mentioned that the free energy and entropy considered by them had nothing in common with the quantities employed in physics [2,3]. It is also obvious that a purely information-theoretic approach used by the authors with regard to the problems under study allows many other wordings and interpretations. However, in spite of this fact, in their last papers as well as in the present paper, the authors choose specifically FEP. Apparently, it may be explained by the intent to additionally base their approach on the foundation of statistical thermodynamics and therefore to demonstrate the universality of the described method. However, this is exactly what might cause misunderstandings specifically among physicists and consequently in their rejection and ignoring of FEP. The physical analogy employed by the authors has the following fundamental inconsistencies: In physics, free energy is used to describe

  3. Corrosion avoidance with new wood preservatives

    Treesearch

    Samuel L. Zelinka; Douglas R. Rammer

    2007-01-01

    This article focuses on considerations that need to be made when choosing products, other than stainless steel, to minimize corrosion of metals in contact with treated wood. With so many "corrosion-resistant" alternative products on the market, it is important to know the fundamental principles of corrosion protection to make informed decisions when designing...

  4. Determination of real machine-tool settings and minimization of real surface deviation by computerized inspection

    NASA Technical Reports Server (NTRS)

    Litvin, Faydor L.; Kuan, Chihping; Zhang, YI

    1991-01-01

    A numerical method is developed for the minimization of deviations of real tooth surfaces from the theoretical ones. The deviations are caused by errors of manufacturing, errors in the installation of machine-tool settings, and distortion of surfaces by heat-treatment. The deviations are determined by coordinate measurements of gear tooth surfaces. The minimization of deviations is based on the proper correction of initially applied machine-tool settings. The contents of the accomplished research project cover the following topics: (1) Descriptions of the principle of coordinate measurements of gear tooth surfaces; (2) Deviation of theoretical tooth surfaces (with examples of surfaces of hypoid gears and references for spiral bevel gears); (3) Determination of the reference point and the grid; (4) Determination of the deviations of real tooth surfaces at the points of the grid; and (5) Determination of required corrections of machine-tool settings for minimization of deviations. The procedure for minimization of deviations is based on numerical solution of an overdetermined system of n linear equations in m unknowns (m much less than n), where n is the number of measurement points and m is the number of parameters of applied machine-tool settings to be corrected. The developed approach is illustrated with numerical examples.
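
    The final step, solving an overdetermined system of n linear equations in m unknowns (m much less than n) for the setting corrections, is a linear least-squares problem. The sketch below illustrates that step with made-up numbers; the sensitivity matrix and deviations are not from the paper.

      # Overdetermined least-squares step with made-up numbers (illustrative only).
      # Rows of A: assumed sensitivity of the surface deviation at each measured grid point to each
      # machine-tool setting; b: measured deviations. We seek corrections x minimizing ||A x - b||.
      import numpy as np

      rng = np.random.default_rng(0)
      n, m = 45, 6                                        # n measurement points >> m settings
      A = rng.normal(size=(n, m))                         # hypothetical sensitivity matrix
      x_true = np.array([0.02, -0.01, 0.005, 0.0, 0.015, -0.03])
      b = A @ x_true + rng.normal(scale=1e-3, size=n)     # deviations with measurement noise

      x_corr, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)
      print(x_corr)                                       # estimated machine-tool setting corrections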

  5. The Handicap Principle for Trust in Computer Security, the Semantic Web and Social Networking

    NASA Astrophysics Data System (ADS)

    Ma, Zhanshan (Sam); Krings, Axel W.; Hung, Chih-Cheng

    Communication is a fundamental function of life, and it exists in almost all living things: from single-cell bacteria to human beings. Communication, competition, and cooperation are three fundamental processes in nature. Computer scientists are familiar with the study of competition or 'struggle for life' through Darwin's evolutionary theory, or even evolutionary computing. They may be equally familiar with the study of cooperation or altruism through the Prisoner's Dilemma (PD) game. However, they are likely to be less familiar with the theory of animal communication. The objective of this article is three-fold: (i) To suggest that the study of animal communication, especially the honesty (reliability) of animal communication, in which some significant advances in behavioral biology have been achieved in the last three decades, may be on the verge of spawning important cross-disciplinary research similar to that generated by the study of cooperation with the PD game. One of the far-reaching advances in the field is marked by the publication of "The Handicap Principle: a Missing Piece of Darwin's Puzzle" by Zahavi (1997). The 'Handicap' principle [34][35], which states that communication signals must be costly in some proper way to be reliable (honest), is best elucidated with evolutionary games, e.g., the Sir Philip Sidney (SPS) game [23]. Accordingly, we suggest that the Handicap principle may serve as a fundamental paradigm for trust research in computer science. (ii) To suggest to computer scientists that their expertise in modeling computer networks may help behavioral biologists in their study of the reliability of animal communication networks. This is largely due to the historical reason that, until the last decade, animal communication was studied with the dyadic paradigm (sender-receiver) rather than with the network paradigm. (iii) To pose several open questions, the answers to which may bear some refreshing insights to trust research in

  6. Inferring the Minimal Genome of Mesoplasma florum by Comparative Genomics and Transposon Mutagenesis.

    PubMed

    Baby, Vincent; Lachance, Jean-Christophe; Gagnon, Jules; Lucier, Jean-François; Matteau, Dominick; Knight, Tom; Rodrigue, Sébastien

    2018-01-01

    The creation and comparison of minimal genomes will help better define the most fundamental mechanisms supporting life. Mesoplasma florum is a near-minimal, fast-growing, nonpathogenic bacterium potentially amenable to genome reduction efforts. In a comparative genomic study of 13 M. florum strains, including 11 newly sequenced genomes, we have identified the core genome and open pangenome of this species. Our results show that all of the strains have approximately 80% of their gene content in common. Of the remaining 20%, 17% of the genes were found in multiple strains and 3% were unique to any given strain. On the basis of random transposon mutagenesis, we also estimated that ~290 out of 720 genes are essential for M. florum L1 in rich medium. We next evaluated different genome reduction scenarios for M. florum L1 by using gene conservation and essentiality data, as well as comparisons with the first working approximation of a minimal organism, Mycoplasma mycoides JCVI-syn3.0. Our results suggest that 409 of the 473 M. mycoides JCVI-syn3.0 genes have orthologs in M. florum L1. Conversely, 57 putatively essential M. florum L1 genes have no homolog in M. mycoides JCVI-syn3.0. This suggests differences in minimal genome compositions, even for these evolutionarily closely related bacteria. IMPORTANCE The last years have witnessed the development of whole-genome cloning and transplantation methods and the complete synthesis of entire chromosomes. Recently, the first minimal cell, Mycoplasma mycoides JCVI-syn3.0, was created. Despite these milestone achievements, several questions remain to be answered. For example, is the composition of minimal genomes virtually identical in phylogenetically related species? On the basis of comparative genomics and transposon mutagenesis, we investigated this question by using an alternative model, Mesoplasma florum, that is also amenable to genome reduction efforts. Our results suggest that the creation of additional minimal
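
    The core-versus-pangenome bookkeeping described above can be pictured with a small set-based sketch; the strain labels and gene identifiers below are invented placeholders, not data from the study.

      # Toy sketch of core vs. pan genome bookkeeping across strains (strain names and gene IDs invented).
      strain_genes = {
          "strainA": {"dnaA", "gyrB", "rpoB", "abcX"},
          "strainB": {"dnaA", "gyrB", "rpoB", "traY"},
          "strainC": {"dnaA", "gyrB", "rpoB"},
      }

      core = set.intersection(*strain_genes.values())     # genes shared by every strain
      pan = set.union(*strain_genes.values())             # genes observed in any strain
      unique = {s: genes - set.union(*(g for t, g in strain_genes.items() if t != s))
                for s, genes in strain_genes.items()}     # strain-specific genes

      print(f"core fraction of pangenome: {len(core) / len(pan):.0%}")
      print(unique)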

  7. Increasingly minimal bias routing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bataineh, Abdulla; Court, Thomas; Roweth, Duncan

    2017-02-21

    A system and algorithm configured to generate diversity at the traffic source so that packets are uniformly distributed over all of the available paths, but to increase the likelihood of taking a minimal path with each hop the packet takes. This is achieved by configuring routing biases so as to prefer non-minimal paths at the injection point, but increasingly prefer minimal paths as the packet proceeds, referred to herein as Increasing Minimal Bias (IMB).
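
    The record above describes the routing idea only at a high level; the sketch below shows one way a hop-dependent bias toward minimal paths could be expressed. It is purely illustrative and is not the patented algorithm.

      # Purely illustrative sketch of an increasing-minimal-bias choice rule (not the patented algorithm).
      # At each hop the router chooses between a minimal-path port and a non-minimal (diversity) port,
      # with the probability of going minimal growing as the packet accumulates hops.
      import random

      def choose_port(minimal_ports, non_minimal_ports, hops_taken, max_hops):
          bias = min(1.0, hops_taken / max_hops)        # 0 at injection, 1 near the destination
          if non_minimal_ports and random.random() > bias:
              return random.choice(non_minimal_ports)   # early hops: spread traffic over all paths
          return random.choice(minimal_ports)           # later hops: increasingly prefer minimal paths

      # At injection (hop 0) the packet almost always takes a diversity port;
      # by the final hop it is always routed minimally.
      print(choose_port(["p0"], ["p1", "p2"], hops_taken=0, max_hops=4))
      print(choose_port(["p0"], ["p1", "p2"], hops_taken=4, max_hops=4))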

  8. Variational principle for the Navier-Stokes equations.

    PubMed

    Kerswell, R R

    1999-05-01

    A variational principle is presented for the Navier-Stokes equations in the case of a contained boundary-driven, homogeneous, incompressible, viscous fluid. Based upon making the fluid's total viscous dissipation over a given time interval stationary subject to the constraint of the Navier-Stokes equations, the variational problem looks overconstrained and intractable. However, introducing a nonunique velocity decomposition, u(x,t)=phi(x,t) + nu(x,t), "opens up" the variational problem so that what is presumed a single allowable point over the velocity domain u corresponding to the unique solution of the Navier-Stokes equations becomes a surface with a saddle point over the extended domain (phi,nu). Complementary or dual variational problems can then be constructed to estimate this saddle point value strictly from above as part of a minimization process or below via a maximization procedure. One of these reduced variational principles is the natural and ultimate generalization of the upper bounding problem developed by Doering and Constantin. The other corresponds to the ultimate Busse problem which now acts to lower bound the true dissipation. Crucially, these reduced variational problems require only the solution of a series of linear problems to produce bounds even though their unique intersection is conjectured to correspond to a solution of the nonlinear Navier-Stokes equations.

  9. Basic principles of stability.

    PubMed

    Egan, William; Schofield, Timothy

    2009-11-01

    An understanding of the principles of degradation, as well as the statistical tools for measuring product stability, is essential to management of product quality. Key to this is management of vaccine potency. Vaccine shelf life is best managed through determination of a minimum potency release requirement, which helps assure adequate potency throughout expiry. Statistical tools such as least-squares regression analysis should be employed to model potency decay. The use of such tools provides an incentive to properly design vaccine stability studies, while holding stability measurements to specification presents a disincentive for collecting valuable data. The laws of kinetics, such as Arrhenius behavior, help practitioners design effective accelerated stability programs, which can be utilized to manage stability after a process change. The design of stability studies should be carefully considered, with an eye to minimizing the variability of the stability parameter. In the case of measuring the degradation rate, testing at the beginning and the end of the study improves the precision of this estimate. Additional design considerations such as bracketing and matrixing improve the efficiency of stability evaluation of vaccines.
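
    To make the regression step concrete, the sketch below fits a simple linear potency-decay model by least squares; the time points and potency values are invented for illustration and are not from any actual stability study.

      # Least-squares fit of potency decay over time (invented data; illustrative only).
      # A linear model, potency = intercept + slope * months, is fitted; the slope estimate and its
      # uncertainty are what drive shelf-life and minimum release-limit decisions.
      import numpy as np

      months = np.array([0, 3, 6, 9, 12, 18, 24], dtype=float)
      potency = np.array([102.0, 100.5, 99.8, 98.6, 97.9, 96.0, 94.3])   # e.g. percent of label claim

      slope, intercept = np.polyfit(months, potency, deg=1)
      print(f"estimated decay rate: {slope:.3f} potency units per month")
      print(f"predicted potency at 36 months: {intercept + slope * 36:.1f}")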

  10. Radiotherapy and wound healing: principles, management and prospects (review).

    PubMed

    Gieringer, Matthias; Gosepath, Jan; Naim, Ramin

    2011-08-01

    Radiation therapy is a major therapeutic modality in the management of cancer patients. Over 60% of these patients receive radiotherapy at some point during their course of treatment, and over 90% will develop skin reactions after therapy. Problematic wound healing in radiation-damaged tissue constitutes a major surgical difficulty, and despite all efforts, irradiated skin remains a therapeutic challenge. This review provides an overview of the fundamental principles of radiation therapy with regard to wound healing in normal and irradiated skin. Furthermore, it presents techniques for preventing and managing skin side effects, as well as prospects that may improve cutaneous wound repair in general and in irradiated skin.

  11. Individual differences in fundamental social motives.

    PubMed

    Neel, Rebecca; Kenrick, Douglas T; White, Andrew Edward; Neuberg, Steven L

    2016-06-01

    Motivation has long been recognized as an important component of how people both differ from, and are similar to, each other. The current research applies the biologically grounded fundamental social motives framework, which assumes that human motivational systems are functionally shaped to manage the major costs and benefits of social life, to understand individual differences in social motives. Using the Fundamental Social Motives Inventory, we explore the relations among the different fundamental social motives of Self-Protection, Disease Avoidance, Affiliation, Status, Mate Seeking, Mate Retention, and Kin Care; the relationships of the fundamental social motives to other individual difference and personality measures including the Big Five personality traits; the extent to which fundamental social motives are linked to recent life experiences; and the extent to which life history variables (e.g., age, sex, childhood environment) predict individual differences in the fundamental social motives. Results suggest that the fundamental social motives are a powerful lens through which to examine individual differences: They are grounded in theory, have explanatory value beyond that of the Big Five personality traits, and vary meaningfully with a number of life history variables. A fundamental social motives approach provides a generative framework for considering the meaning and implications of individual differences in social motivation. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Minimally Invasive Dentistry

    MedlinePlus

    ... structure. It focuses on prevention, remineralization, and minimal dentist intervention. Using scientific advances, minimally invasive dentistry allows ...

  13. Self-organization: the fundament of cell biology.

    PubMed

    Wedlich-Söldner, Roland; Betz, Timo

    2018-05-26

    Self-organization refers to the emergence of an overall order in time and space of a given system that results from the collective interactions of its individual components. This concept has been widely recognized as a core principle in pattern formation for multi-component systems of the physical, chemical and biological world. It can be distinguished from self-assembly by the constant input of energy required to maintain order, and self-organization therefore typically occurs in non-equilibrium or dissipative systems. Cells, with their constant energy consumption and myriads of local interactions between distinct proteins, lipids, carbohydrates and nucleic acids, represent the perfect playground for self-organization. It therefore comes as no surprise that many properties and features of self-organized systems, such as spontaneous formation of patterns, nonlinear coupling of reactions, bi-stable switches, waves and oscillations, are found in all aspects of modern cell biology. Ultimately, self-organization lies at the heart of the robustness and adaptability found in cellular and organismal organization, and hence constitutes a fundamental basis for natural selection and evolution. This article is part of the theme issue 'Self-organization in cell biology'. © 2018 The Author(s).

  14. Self-organization: the fundament of cell biology

    PubMed Central

    Betz, Timo

    2018-01-01

    Self-organization refers to the emergence of an overall order in time and space of a given system that results from the collective interactions of its individual components. This concept has been widely recognized as a core principle in pattern formation for multi-component systems of the physical, chemical and biological world. It can be distinguished from self-assembly by the constant input of energy required to maintain order—and self-organization therefore typically occurs in non-equilibrium or dissipative systems. Cells, with their constant energy consumption and myriads of local interactions between distinct proteins, lipids, carbohydrates and nucleic acids, represent the perfect playground for self-organization. It therefore comes as no surprise that many properties and features of self-organized systems, such as spontaneous formation of patterns, nonlinear coupling of reactions, bi-stable switches, waves and oscillations, are found in all aspects of modern cell biology. Ultimately, self-organization lies at the heart of the robustness and adaptability found in cellular and organismal organization, and hence constitutes a fundamental basis for natural selection and evolution. This article is part of the theme issue ‘Self-organization in cell biology’. PMID:29632257

  15. Nanoparticles for Biomedical Imaging: Fundamentals of Clinical Translation

    PubMed Central

    Choi, Hak Soo; Frangioni, John V.

    2010-01-01

    Because of their large size compared to small molecules, and their multi-functionality, nanoparticles (NPs) hold promise as biomedical imaging, diagnostic, and theragnostic agents. However, the key to their success hinges on a detailed understanding of their behavior after administration into the body. NP biodistribution, target binding, and clearance are a complex function of their physicochemical properties in serum, which include hydrodynamic diameter, solubility, stability, shape and flexibility, surface charge, composition, and formulation. Moreover, many materials used to construct NPs have real or potential toxicity, or may interfere with other medical tests. In this review, we discuss the design considerations that mediate NP behavior in the body and the fundamental principles that govern clinical translation. By analyzing those nanomaterials that have already received regulatory approval, most of which are actually therapeutic agents, we attempt to predict which types of NPs hold potential as diagnostic agents for biomedical imaging. Finally, using quantum dots as an example, we provide a framework for deciding whether an NP-based agent is the best choice for a particular clinical application. PMID:21084027

  16. Understanding and applying principles of social cognition and decision making in adaptive environmental governance.

    PubMed

    DeCaro, Daniel A; Arnol, Craig Anthony Tony; Boama, Emmanuel Frimpong; Garmestani, Ahjond S

    2017-03-01

    Environmental governance systems are under greater pressure to adapt and to cope with increased social and ecological uncertainty from stressors like climate change. We review principles of social cognition and decision making that shape and constrain how environmental governance systems adapt. We focus primarily on the interplay between key decision makers in society and legal systems. We argue that adaptive governance must overcome three cooperative dilemmas to facilitate adaptation: (1) encouraging collaborative problem solving, (2) garnering social acceptance and commitment, and (3) cultivating a culture of trust and tolerance for change and uncertainty. However, to do so governance systems must cope with biases in people's decision making that cloud their judgment and create conflict. These systems must also satisfy people's fundamental needs for self-determination, fairness, and security, ensuring that changes to environmental governance are perceived as legitimate, trustworthy, and acceptable. We discuss the implications of these principles for common governance solutions (e.g., public participation, enforcement) and conclude with methodological recommendations. We outline how scholars can investigate the social cognitive principles involved in cases of adaptive governance.

  17. Understanding and applying principles of social cognition and decision making in adaptive environmental governance

    PubMed Central

    DeCaro, Daniel A.; Arnol, Craig Anthony (Tony); Boama, Emmanuel Frimpong; Garmestani, Ahjond S.

    2018-01-01

    Environmental governance systems are under greater pressure to adapt and to cope with increased social and ecological uncertainty from stressors like climate change. We review principles of social cognition and decision making that shape and constrain how environmental governance systems adapt. We focus primarily on the interplay between key decision makers in society and legal systems. We argue that adaptive governance must overcome three cooperative dilemmas to facilitate adaptation: (1) encouraging collaborative problem solving, (2) garnering social acceptance and commitment, and (3) cultivating a culture of trust and tolerance for change and uncertainty. However, to do so governance systems must cope with biases in people’s decision making that cloud their judgment and create conflict. These systems must also satisfy people’s fundamental needs for self-determination, fairness, and security, ensuring that changes to environmental governance are perceived as legitimate, trustworthy, and acceptable. We discuss the implications of these principles for common governance solutions (e.g., public participation, enforcement) and conclude with methodological recommendations. We outline how scholars can investigate the social cognitive principles involved in cases of adaptive governance. PMID:29780425

  18. Transatlantic Multispecialty Consensus on Fundamental Endovascular Skills: Results of a Delphi Consensus Study.

    PubMed

    Maertens, H; Aggarwal, R; Macdonald, S; Vermassen, F; Van Herzeele, I

    2016-01-01

    The aim of this study was to establish a consensus on Fundamental Endovascular Skills (FES) for educational purposes and development of training curricula for endovascular procedures. The term "Fundamental Endovascular Skills" is widely used; however, the current literature does not explicitly describe what skills are included in this concept. Endovascular interventions are performed by several specialties that may have opposing perspectives on these skills. A two round Delphi questionnaire approach was used. Experts from interventional cardiology, interventional radiology, and vascular surgery from the United States and Europe were invited to participate. An electronic questionnaire was generated by endovascular therapists with an appropriate educational background but who would not participate in subsequent rounds. The questionnaire consisted of 50 statements describing knowledge, technical, and behavioral skills during endovascular procedures. Experts received the questionnaires by email. They were asked to rate the importance of each skill on a Likert scale from 1 to 5. A statement was considered fundamental when more than 90% of the experts rated it 4 or 5 out of 5. Twenty-three of 53 experts invited agreed to participate: six interventional radiologists (2 USA, 4 Europe), 10 vascular surgeons (4 USA, 6 Europe), and seven interventional cardiologists (4 USA, 3 Europe). There was a 100% response rate in the first round and 87% in the second round. Results showed excellent consensus among responders (Cronbach's alpha = .95 first round; .93 second round). Ninety percent of all proposed skills were considered fundamental. The most critical skills were determined. A transatlantic multispecialty consensus was achieved about the content of "FES" among interventional radiologists, interventional cardiologists, and vascular surgeons from Europe and the United States. These results can serve as directive principles for developing endovascular training curricula

  19. The Steinberg-Bernstein Centre for Minimally Invasive Surgery at McGill University.

    PubMed

    Fried, Gerald M

    2005-12-01

    Surgical skills and simulation centers have been developed in recent years to meet the educational needs of practicing surgeons, residents, and students. The rapid pace of innovation in surgical procedures and technology, as well as the overarching desire to enhance patient safety, have driven the development of simulation technology and new paradigms for surgical education. McGill University has implemented an innovative approach to surgical education in the field of minimally invasive surgery. The goal is to measure surgical performance in the operating room using practical, reliable, and valid metrics, which allow the educational needs of the learner to be established and enable feedback and performance to be tracked over time. The GOALS system and the MISTELS program have been developed to measure operative performance and minimally invasive surgical technical skills in the inanimate skills lab, respectively. The MISTELS laparoscopic simulation-training program has been incorporated as the manual skills education and evaluation component of the Fundamentals of Laparoscopic Surgery program distributed by the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES) and the American College of Surgeons.

  20. A review of fundamental principles for animal models of DOHaD research: an Australian perspective.

    PubMed

    Dickinson, H; Moss, T J; Gatford, K L; Moritz, K M; Akison, L; Fullston, T; Hryciw, D H; Maloney, C A; Morris, M J; Wooldridge, A L; Schjenken, J E; Robertson, S A; Waddell, B J; Mark, P J; Wyrwoll, C S; Ellery, S J; Thornburg, K L; Muhlhausler, B S; Morrison, J L

    2016-10-01

    Epidemiology formed the basis of 'the Barker hypothesis', the concept of 'developmental programming' and today's discipline of the Developmental Origins of Health and Disease (DOHaD). Animal experimentation provided proof of the underlying concepts, and continues to generate knowledge of underlying mechanisms. Interventions in humans, based on DOHaD principles, will be informed by experiments in animals. As knowledge in this discipline has accumulated, from studies of humans and other animals, the complexity of interactions between genome, environment and epigenetics, has been revealed. The vast nature of programming stimuli and breadth of effects is becoming known. As a result of our accumulating knowledge we now appreciate the impact of many variables that contribute to programmed outcomes. To guide further animal research in this field, the Australia and New Zealand DOHaD society (ANZ DOHaD) Animals Models of DOHaD Research Working Group convened at the 2nd Annual ANZ DOHaD Congress in Melbourne, Australia in April 2015. This review summarizes the contributions of animal research to the understanding of DOHaD, and makes recommendations for the design and conduct of animal experiments to maximize relevance, reproducibility and translation of knowledge into improving health and well-being.

  1. [Specific features in realization of the principle of minimum energy dissipation during individual development].

    PubMed

    Zotin, A A

    2012-01-01

    Realization of the principle of minimum energy dissipation (Prigogine's theorem) during individual development has been analyzed. This analysis has suggested the following reformulation of this principle for living objects: when environmental conditions are constant, the living system evolves to a current steady state in such a way that the difference between entropy production and entropy flow (psi(u) function) is positive and constantly decreases near the steady state, approaching zero. In turn, the current steady state tends to a final steady state in such a way that the difference between the specific entropy productions in an organism and its environment tends to be minimal. In general, individual development completely agrees with the law of entropy increase (second law of thermodynamics).
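
    As background (a schematic restatement of the standard formalism, not a quotation of Zotin's psi(u) definition), the entropy balance behind Prigogine's theorem splits the total entropy change into internal production and exchange with the environment, and near a steady state under fixed external conditions the production term decreases monotonically toward its minimum:

        \frac{dS}{dt} = \frac{d_i S}{dt} + \frac{d_e S}{dt}, \qquad \frac{d_i S}{dt} \ge 0, \qquad \frac{d}{dt}\!\left(\frac{d_i S}{dt}\right) \le 0 \quad \text{near the steady state.}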

  2. [How to be prudent with synthetic biology. Synthetic Biology and the precautionary principle].

    PubMed

    Rodríguez López, Blanca

    2014-01-01

    Synthetic biology is a new discipline with two faces: on the one hand, it promises benefits that could alleviate some of the ills that plague mankind; on the other hand, like all technologies, it carries risks. Given these risks, the most critical and concerned voices invoke the precautionary principle, which is commonly applied when an activity or new technology creates risks to the environment and/or human health, yet is far from universally accepted and is currently one of the most controversial principles. This paper analyzes the risks and benefits of synthetic biology and the relevance of applying the precautionary principle to it. To do so, we proceed as follows. The first part focuses on synthetic biology. First, the discipline is characterized, with special attention to what is novel compared with what is known as "genetic engineering". Then both the benefits and the risks associated with it are discussed. The first part concludes with a review of the efforts currently being made to control or minimize the risks. The second part analyzes the precautionary principle and its possible relevance to the case of synthetic biology. The different versions and interpretations of the principle, and the various criticisms to which it has been subject, are reviewed. Finally, after discarding the precautionary principle as a useful tool, some recent proposals for assessing technologies that take into account not only their risks but also their benefits are judged more appropriate.

  3. Quantum Bath Refrigeration towards Absolute Zero: Challenging the Unattainability Principle

    NASA Astrophysics Data System (ADS)

    Kolář, M.; Gelbwaser-Klimovsky, D.; Alicki, R.; Kurizki, G.

    2012-08-01

    A minimal model of a quantum refrigerator, i.e., a periodically phase-flipped two-level system permanently coupled to a finite-capacity bath (cold bath) and an infinite heat dump (hot bath), is introduced and used to investigate the cooling of the cold bath towards absolute zero (T=0). Remarkably, the temperature scaling of the cold-bath cooling rate reveals that it does not vanish as T→0 for certain realistic quantized baths, e.g., phonons in strongly disordered media (fractons) or quantized spin waves in ferromagnets (magnons). This result challenges Nernst’s third-law formulation known as the unattainability principle.

  4. Quantum bath refrigeration towards absolute zero: challenging the unattainability principle.

    PubMed

    Kolář, M; Gelbwaser-Klimovsky, D; Alicki, R; Kurizki, G

    2012-08-31

    A minimal model of a quantum refrigerator, i.e., a periodically phase-flipped two-level system permanently coupled to a finite-capacity bath (cold bath) and an infinite heat dump (hot bath), is introduced and used to investigate the cooling of the cold bath towards absolute zero (T=0). Remarkably, the temperature scaling of the cold-bath cooling rate reveals that it does not vanish as T→0 for certain realistic quantized baths, e.g., phonons in strongly disordered media (fractons) or quantized spin waves in ferromagnets (magnons). This result challenges Nernst's third-law formulation known as the unattainability principle.

  5. Generalized Uncertainty Principle and Parikh-Wilczek Tunneling

    NASA Astrophysics Data System (ADS)

    Mehdipour, S. Hamid

    We investigate the modifications of the Hawking radiation by the Generalized Uncertainty Principle (GUP) and the tunneling process. By using the GUP-corrected de Broglie wavelength, the squeezing of the fundamental momentum cell, and consequently a GUP-corrected energy, we find the nonthermal effects which lead to a nonzero statistical correlation function between the probabilities of tunneling of two massive particles with different energies. The recovery of part of the information from the black hole radiation is then feasible. From another point of view, the inclusion of quantum gravity effects through the GUP expression can halt the evaporation process, so that a stable black hole remnant is left behind, containing the other part of the black hole information content. Therefore, these features of the Planck-scale corrections may solve the information problem in black hole evaporation.
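
    As background for readers unfamiliar with the GUP (a commonly used quadratic form, not necessarily the exact expression adopted in this paper), the Heisenberg relation is deformed so that a minimal position uncertainty appears:

        \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta \,(\Delta p)^2\right), \qquad \Delta x_{\min} = \hbar\sqrt{\beta},

    so that, for β tied to the Planck scale, no wave packet can be localized below roughly the Planck length; this is the kind of Planck-scale deformation the abstract refers to.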

  6. A model for Entropy Production, Entropy Decrease and Action Minimization in Self-Organization

    NASA Astrophysics Data System (ADS)

    Georgiev, Georgi; Chatterjee, Atanu; Vu, Thanh; Iannacchione, Germano

    In self-organization, energy gradients across complex systems lead to changes in the structure of the systems, decreasing their internal entropy to ensure the most efficient energy transport and therefore maximum entropy production in the surroundings. This approach stems from fundamental variational principles in physics, such as the principle of least action. It is coupled to the total energy flowing through a system, which leads to an increase in action efficiency. We compare energy transport through a fluid cell which has random motion of its molecules and a cell which can form convection cells. We examine the signs of the entropy changes and the action needed for the motion inside those systems. The system in which convective motion occurs reduces the time for energy transmission, compared to random motion. For more complex systems, those convection cells form a network of transport channels, for the purpose of obeying the equations of motion in this geometry. Such transport networks are an essential feature of complex systems in biology, ecology, economy and society.

  7. Electrosurgery: principles and practice to reduce risk and maximize efficacy.

    PubMed

    Brill, Andrew I

    2011-12-01

    Science becomes art and art becomes function when fundamental principles are utilized to dictate surgical practice. Most important, the risk for inadvertent thermal injury during electrosurgery can be minimized by a sound comprehension of the predictable behaviors of electricity in living tissue. Guided by the Hippocratic charge of primum non nocere, the ultimate aim of energy-assisted surgery is the attainment of anatomic dissection and hemostasis with the least amount of collateral damage and subsequent scar tissue formation. Ideally, the surgeon’s final view of the operative field should accurately approximate the topography discoverable after postoperative healing. Despite the continued innovation of products borne to reduce thermal damage and then marketed as being comparatively safer, it is the hands and mind of the surgeon that serve to preserve tissue integrity by reducing the burden of delayed thermal necrosis and taking steps to prevent excessive devitalization of tissue. Regardless of the chosen modality, the inseparable and exponentially linked elements of time and the quantity of delivered energy must be integrated while purposefully moderating to attain the desired tissue effect. Ultimately, the reduction of unwanted thermal injury is inherently linked to good surgical judgment and technique, a sound comprehension of the applied energy modality, and the surgeon’s ability to recognize anatomic structures within the field of surgical dissection as well as those within the zone of significant thermal change. During the use of any energy-based device for hemostasis, out of sight must never mean out of mind. If the bowel, bladder, or ureter is in close proximity to a bleeder, they should be sufficiently mobilized before applying energy. Thermal energy should always be withheld until an orderly sequence of anatomic triage is carried out. Whenever a vital structure cannot be adequately mobilized, hemorrhage is preferentially controlled by using mechanical

  8. Redesigning metabolism based on orthogonality principles

    PubMed Central

    Pandit, Aditya Vikram; Srinivasan, Shyam; Mahadevan, Radhakrishnan

    2017-01-01

    Modifications made during metabolic engineering for overproduction of chemicals have network-wide effects on cellular function due to ubiquitous metabolic interactions. These interactions, which make metabolic network structures robust and optimized for cell growth, constrain the capability of the cell factory. To overcome these challenges, we explore the idea of an orthogonal network structure that is designed to operate with minimal interaction between chemical production pathways and the components of the network that produce biomass. We show that this orthogonal pathway design approach has significant advantages over contemporary growth-coupled approaches using a case study on succinate production. We find that natural pathways, fundamentally linked to biomass synthesis, are less orthogonal in comparison to synthetic pathways. We suggest that the use of such orthogonal pathways can be highly amenable for dynamic control of metabolism and have other implications for metabolic engineering. PMID:28555623

  9. Atraumatic restorative treatment and minimal intervention dentistry.

    PubMed

    Frencken, J E

    2017-08-11

    Too many people worldwide suffer from the consequences of untreated dentine carious lesions. This finding reflects the inability of the currently used traditional mode of treatments to manage such lesions. A change is needed. Dental training institutions should depart from the traditional 'drill and fill' treatments and embrace the holistic oral healthcare approach that is minimal intervention dentistry (MID) and includes within it minimally invasive operative skills. Dental caries is, after all, a preventable disease. The atraumatic restorative treatment (ART) concept is an example of MID. ART consists of a preventive (ART sealant) and a restorative (ART restoration) component. ART sealants using high-viscosity glass-ionomer (HVGIC) have a very high dentine carious lesion preventive effect. The survival rate of these sealants is not significantly different from that of sealants produced with resin. The survival rate of ART/HVGIC restorations matches those of amalgam and resin composite in single- and multiple-surface cavities in primary teeth and in single-surface cavities in permanent teeth. The principles of carious tissue removal within a cavity recommended by the International Caries Consensus Collaboration are in line with those of treating a cavity using ART. Owing to its good performance and the low levels of discomfort/pain and dental anxiety associated with it, ART and/or other evidence-based atraumatic care procedures should be the first treatment for a primary dentine carious lesion. Only if the use of ART is not indicated should other more invasive and less-atraumatic care procedures be used in both primary and permanent dentitions.

  10. A comprehensive program to minimize platelet outdating.

    PubMed

    Fuller, Alice K; Uglik, Kristin M; Braine, Hayden G; King, Karen E

    2011-07-01

    Platelet (PLT) transfusions are essential for patients who are bleeding or have an increased risk of bleeding due to a decreased number or abnormal function of circulating PLTs. A shelf life of 5 days for PLT products presents an inventory management challenge. In 2006, greater than 10% of apheresis PLTs made in the United States outdated. It is imperative to have a sufficient number of products for patients requiring transfusion, but outdating PLTs is a financial burden and a waste of a resource. We present the approach used in our institution to anticipate inventory needs based on current patient census and usage. Strategies to predict usage and to identify changes in anticipated usage are examined. Annual outdating is reviewed for a 10-year period from 2000 through 2009. From January 1, 2000, through December 2009, there were 128,207 PLT transfusions given to 15,265 patients. The methods used to anticipate usage and adjust inventory resulted in an annual outdate rate of approximately 1% for the 10-year period reviewed. In addition we have not faced situations where inventory was inadequate to meet the needs of the patients requiring transfusions. We have identified three elements of our transfusion service that can minimize outdate: a knowledgeable proactive staff dedicated to PLT management, a comprehensive computer-based transfusion history for each patient, and a strong two-way relationship with the primary product supplier. Through our comprehensive program, based on the principles of providing optimal patient care, we have minimized PLT outdating for more than 10 years. © 2011 American Association of Blood Banks.

  11. Roy's safety-first portfolio principle in financial risk management of disastrous events.

    PubMed

    Chiu, Mei Choi; Wong, Hoi Ying; Li, Duan

    2012-11-01

    Roy pioneers the concept and practice of risk management of disastrous events via his safety-first principle for portfolio selection. More specifically, his safety-first principle advocates an optimal portfolio strategy generated from minimizing the disaster probability, while subject to the budget constraint and the mean constraint that the expected final wealth is not less than a preselected disaster level. This article studies the dynamic safety-first principle in continuous time and its application in asset and liability management. We reveal that the distortion resulting from dropping the mean constraint, as a common practice to approximate the original Roy's setting, either leads to a trivial case or changes the problem nature completely to a target-reaching problem, which produces a highly leveraged trading strategy. Recognizing the ill-posed nature of the corresponding Lagrangian method when retaining the mean constraint, we invoke a wisdom observed from a limited funding-level regulation of pension funds and modify the original safety-first formulation accordingly by imposing an upper bound on the funding level. This model revision enables us to solve completely the safety-first asset-liability problem by a martingale approach and to derive an optimal policy that follows faithfully the spirit of the safety-first principle and demonstrates a prominent nature of fighting for the best and preventing disaster from happening. © 2012 Society for Risk Analysis.
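
    Written schematically in our own notation (not the authors'), Roy's static safety-first problem minimizes the disaster probability of terminal wealth W_T over admissible portfolios x, subject to the budget constraint on initial wealth and the mean constraint discussed in the abstract:

        \min_{x} \; \Pr\!\big(W_T \le d\big) \quad \text{subject to} \quad \mathbb{E}[W_T] \ge m, \qquad W_0 = w_0 ,

    where d is the disaster level and m the preselected floor on expected final wealth; the paper's continuous-time version adds wealth dynamics and, in the revised formulation, an upper bound on the funding level.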

  12. Magnetism: Principles and Applications

    NASA Astrophysics Data System (ADS)

    Craik, Derek J.

    2003-09-01

    If you are studying physics, chemistry, materials science, electrical engineering, information technology or medicine, then you'll know that understanding magnetism is fundamental to success in your studies, and here is the key to unlocking its mysteries. You can: obtain a simple overview of magnetism, including the roles of B and H, resonances and special techniques; take full advantage of modern magnets with a wealth of expressions for fields and forces; develop realistic general design programmes using isoparametric finite elements; study the subtleties of the general theory of magnetic moments and their dynamics; follow the development of outstanding materials; appreciate how magnetism encompasses topics as diverse as rock magnetism, chemical reaction rates, biological compasses, medical therapies, superconductivity and levitation; and understand the basis and remarkable achievements of magnetic resonance imaging. In his new book, Magnetism, Derek Craik throws light on the principles and applications of this fascinating subject. From formulae for calculating fields to quantum theory, the secrets of magnetism are exposed, ensuring that whether you are a chemist or engineer, physicist, medic or materials scientist, Magnetism is the book for your course.

  13. Trials of Intervention Principles: Evaluation Methods for Evolving Behavioral Intervention Technologies

    PubMed Central

    Schueller, Stephen M; Riley, William T; Brown, C Hendricks; Cuijpers, Pim; Duan, Naihua; Kwasny, Mary J; Stiles-Shields, Colleen; Cheung, Ken

    2015-01-01

    In recent years, there has been increasing discussion of the limitations of traditional randomized controlled trial (RCT) methodologies for the evaluation of eHealth and mHealth interventions, and in particular, the requirement that these interventions be locked down during evaluation. Locking down these interventions locks in defects and eliminates the opportunities for quality improvement and adaptation to the changing technological environment, often leading to validation of tools that are outdated by the time that trial results are published. Furthermore, because behavioral intervention technologies change frequently during real-world deployment, even if a tested intervention were deployed in the real world, its shelf life would be limited. We argue that RCTs will have greater scientific and public health value if they focus on the evaluation of intervention principles (rather than a specific locked-down version of the intervention), allowing for ongoing quality improvement modifications to the behavioral intervention technology based on the core intervention principles, while continuously improving the functionality and maintaining technological currency. This paper is an initial proposal of a framework and methodology for the conduct of trials of intervention principles (TIPs) aimed at minimizing the risks of in-trial changes to intervention technologies and maximizing the potential for knowledge acquisition. The focus on evaluation of intervention principles using clinical and usage outcomes has the potential to provide more generalizable and durable information than trials focused on a single intervention technology. PMID:26155878

  14. Trials of Intervention Principles: Evaluation Methods for Evolving Behavioral Intervention Technologies.

    PubMed

    Mohr, David C; Schueller, Stephen M; Riley, William T; Brown, C Hendricks; Cuijpers, Pim; Duan, Naihua; Kwasny, Mary J; Stiles-Shields, Colleen; Cheung, Ken

    2015-07-08

    In recent years, there has been increasing discussion of the limitations of traditional randomized controlled trial (RCT) methodologies for the evaluation of eHealth and mHealth interventions, and in particular, the requirement that these interventions be locked down during evaluation. Locking down these interventions locks in defects and eliminates the opportunities for quality improvement and adaptation to the changing technological environment, often leading to validation of tools that are outdated by the time that trial results are published. Furthermore, because behavioral intervention technologies change frequently during real-world deployment, even if a tested intervention were deployed in the real world, its shelf life would be limited. We argue that RCTs will have greater scientific and public health value if they focus on the evaluation of intervention principles (rather than a specific locked-down version of the intervention), allowing for ongoing quality improvement modifications to the behavioral intervention technology based on the core intervention principles, while continuously improving the functionality and maintaining technological currency. This paper is an initial proposal of a framework and methodology for the conduct of trials of intervention principles (TIPs) aimed at minimizing the risks of in-trial changes to intervention technologies and maximizing the potential for knowledge acquisition. The focus on evaluation of intervention principles using clinical and usage outcomes has the potential to provide more generalizable and durable information than trials focused on a single intervention technology.

  15. Energy, Metaphysics, and Space: Ernst Mach's Interpretation of Energy Conservation as the Principle of Causality

    NASA Astrophysics Data System (ADS)

    Guzzardi, Luca

    2014-06-01

    This paper discusses Ernst Mach's interpretation of the principle of energy conservation (EC) in the context of the development of energy concepts and ideas about causality in nineteenth-century physics and theory of science. In doing this, it focuses on the close relationship between causality, energy conservation and space in Mach's antireductionist view of science. Mach expounds his thesis about EC in his first historical-epistemological essay, Die Geschichte und die Wurzel des Satzes von der Erhaltung der Arbeit (1872): far from being a new principle, it is used from the early beginnings of mechanics independently from other principles; in fact, EC is a pre-mechanical principle which is generally applied in investigating nature: it is, indeed, nothing but a form of the principle of causality. The paper focuses on the scientific-historical premises and philosophical underpinnings of Mach's thesis, beginning with the classic debate on the validity and limits of the notion of cause by Hume, Kant, and Helmholtz. Such reference also implies a discussion of the relationship between causality on the one hand and space and time on the other. This connection plays a major role for Mach, and in the final paragraphs its importance is argued in order to understand his antireductionist perspective, i.e. the rejection of any attempt to give an ultimate explanation of the world via reduction of nature to one fundamental set of phenomena.

  16. A critique of the principle of 'respect for autonomy', grounded in African thought.

    PubMed

    Behrens, Kevin G

    2018-06-01

    I give an account of how the principle of 'respect for autonomy' dominates the field of bioethics, and how it came to triumph over its competitors, 'respect for persons' and 'respect for free power of choice'. I argue that 'respect for autonomy' is unsatisfactory as a basic principle of bioethics because it is grounded in too individualistic a worldview, citing concerns of African theorists and other communitarians who claim that the principle fails to acknowledge the fundamental importance of understanding persons within the nexus of their communal relationships. I defend the claim that 'respect for persons' is a more appropriate principle, as it is able to acknowledge both individual decision making and the essential relationality of persons. I acknowledge that my preference for 'respect for persons' is problematic because of the important debate around the definition of 'personhood' in bioethics discourse. Relying on Thaddeus Metz's conception of moral status, I propose a relational definition of personhood that distinguishes between persons with agency and persons without agency, arguing that we have different moral obligations to these distinct categories of persons. I claim that this conception of personhood is better able to accommodate our moral intuitions than conventional approaches, and that it is able to do so without being speciesist or question-begging. © 2017 John Wiley & Sons Ltd.

  17. TOPICAL REVIEW: First principles studies of multiferroic materials

    NASA Astrophysics Data System (ADS)

    Picozzi, Silvia; Ederer, Claude

    2009-07-01

    Multiferroics, materials where spontaneous long-range magnetic and dipolar orders coexist, represent an attractive class of compounds, which combine rich and fascinating fundamental physics with a technologically appealing potential for applications in the general area of spintronics. Ab initio calculations have significantly contributed to recent progress in this area, by elucidating different mechanisms for multiferroicity and providing essential information on various compounds where these effects are manifestly at play. In particular, here we present examples of density-functional theory investigations for two main classes of materials: (a) multiferroics where ferroelectricity is driven by hybridization or purely structural effects, with BiFeO3 as the prototype material, and (b) multiferroics where ferroelectricity is driven by correlation effects and is strongly linked to electronic degrees of freedom such as spin-, charge-, or orbital-ordering, with rare-earth manganites as prototypes. As for the first class of multiferroics, first principles calculations are shown to provide an accurate qualitative and quantitative description of the physics in BiFeO3, ranging from the prediction of large ferroelectric polarization and weak ferromagnetism, over the effect of epitaxial strain, to the identification of possible scenarios for coupling between ferroelectric and magnetic order. For the second class of multiferroics, ab initio calculations have shown that, in those cases where spin-ordering breaks inversion symmetry (e.g. in antiferromagnetic E-type HoMnO3), the magnetically induced ferroelectric polarization can be as large as a few µC cm-2. The examples presented point the way to several possible avenues for future research: on the technological side, first principles simulations can contribute to a rational materials design, aimed at identifying spintronic materials that exhibit ferromagnetism and ferroelectricity at or above room temperature. On the

  18. Supramolecular chemistry-general principles and selected examples from anion recognition and metallosupramolecular chemistry.

    PubMed

    Albrecht, Markus

    2007-12-01

    This review gives an introduction to supramolecular chemistry, describing in the first part general principles and focusing on terms like noncovalent interaction, molecular recognition, self-assembly, and supramolecular function. In the second part these are illustrated by simple examples from our laboratories. Supramolecular chemistry is the science that bridges the gap between the world of molecules and nanotechnology. In supramolecular chemistry noncovalent interactions occur between molecular building blocks, which by molecular recognition and self-assembly form (functional) supramolecular entities. It is also termed the "chemistry of the noncovalent bond." Molecular recognition rests on geometrical complementarity following the "key-and-lock" principle, with non-shape-dependent effects, e.g., solvation, also being highly influential. Self-assembly leads to the formation of well-defined aggregates, whereby the overall structure of the target ensemble is controlled by the symmetry features of the respective building blocks. Finally, the aggregates can possess special properties or supramolecular functions, which are found only in the ensemble and not in the participating molecules. This review gives an introduction to supramolecular chemistry and illustrates the fundamental principles by recent examples from our group.

  19. Testing minimal flavor violation in leptoquark models of the R_{K^{(*)}} anomaly

    NASA Astrophysics Data System (ADS)

    Aloni, Daniel; Dery, Avital; Frugiuele, Claudia; Nir, Yosef

    2017-11-01

    The R_{K^{(*)}} anomaly can be explained by tree-level exchange of leptoquarks. We study the consequences of subjecting these models to the principle of minimal flavor violation (MFV). We consider MFV in the linear regime, and take the charged lepton Yukawa matrix to be the only spurion that violates lepton flavor universality. We find that a combination of constraints from a variety of processes — b → sμμ, b → sττ, b → sνν, bb̄ → ττ and b → cτν — excludes MFV in these models.

  20. Theoretical aspects of the equivalence principle

    NASA Astrophysics Data System (ADS)

    Damour, Thibault

    2012-09-01

    We review several theoretical aspects of the equivalence principle (EP). We emphasize the unsatisfactory fact that the EP maintains the absolute character of the coupling constants of physics, while general relativity and its generalizations (Kaluza-Klein, …, string theory) suggest that all absolute structures should be replaced by dynamical entities. We discuss the EP-violation phenomenology of dilaton-like models, which is likely to be dominated by the linear superposition of two effects: a signal proportional to the nuclear Coulomb energy, related to the variation of the fine-structure constant, and a signal proportional to the surface nuclear binding energy, related to the variation of the light quark masses. We recall various theoretical arguments (including a recently proposed anthropic argument) suggesting that the EP be violated at a small, but not unmeasurably small level. This motivates the need for improved tests of the EP. These tests are probing new territories in physics that are related to deep, and mysterious, issues in fundamental physics.

  1. Intuitions, principles and consequences

    PubMed Central

    Shaw, A

    2001-01-01

    Some approaches to the assessment of moral intuitions are discussed. The controlled ethical trial isolates a moral issue from confounding factors and thereby clarifies what a person's intuition actually is. Casuistic reasoning from situations, where intuitions are clear, suggests or modifies principles, which can then help to make decisions in situations where intuitions are unclear. When intuitions are defended by a supporting principle, that principle can be tested by finding extreme cases, in which it is counterintuitive to follow the principle. An approach to the resolution of conflict between valid moral principles, specifically the utilitarian and justice principles, is considered. It is argued that even those who justify intuitions by a priori principles are often obliged to modify or support their principles by resort to the consideration of consequences. Key Words: Intuitions • principles • consequences • utilitarianism PMID:11233371

  2. Transcranial Electrical Neuromodulation Based on the Reciprocity Principle

    PubMed Central

    Fernández-Corazza, Mariano; Turovets, Sergei; Luu, Phan; Anderson, Erik; Tucker, Don

    2016-01-01

    A key challenge in multi-electrode transcranial electrical stimulation (TES) or transcranial direct current stimulation (tDCS) is to find a current injection pattern that delivers the necessary current density at a target and minimizes it in the rest of the head, which is mathematically modeled as an optimization problem. Such an optimization with the Least Squares (LS) or Linearly Constrained Minimum Variance (LCMV) algorithms is generally computationally expensive and requires multiple independent current sources. Based on the reciprocity principle in electroencephalography (EEG) and TES, it could be possible to find the optimal TES patterns quickly whenever the solution of the forward EEG problem is available for a brain region of interest. Here, we investigate the reciprocity principle as a guideline for finding optimal current injection patterns in TES that comply with safety constraints. We define four different trial cortical targets in a detailed seven-tissue finite element head model, and analyze the performance of the reciprocity family of TES methods in terms of electrode density, targeting error, focality, intensity, and directionality using the LS and LCMV solutions as the reference standards. It is found that the reciprocity algorithms show good performance comparable to the LCMV and LS solutions. Comparing the 128 and 256 electrode cases, we found that the use of greater electrode density improves focality, directionality, and intensity parameters. The results show that the reciprocity principle can be used to quickly determine optimal current injection patterns in TES and help to simplify TES protocols that are consistent with hardware and software availability and with safety constraints. PMID:27303311

  3. Transcranial Electrical Neuromodulation Based on the Reciprocity Principle.

    PubMed

    Fernández-Corazza, Mariano; Turovets, Sergei; Luu, Phan; Anderson, Erik; Tucker, Don

    2016-01-01

    A key challenge in multi-electrode transcranial electrical stimulation (TES) or transcranial direct current stimulation (tDCS) is to find a current injection pattern that delivers the necessary current density at a target and minimizes it in the rest of the head, which is mathematically modeled as an optimization problem. Such an optimization with the Least Squares (LS) or Linearly Constrained Minimum Variance (LCMV) algorithms is generally computationally expensive and requires multiple independent current sources. Based on the reciprocity principle in electroencephalography (EEG) and TES, it could be possible to find the optimal TES patterns quickly whenever the solution of the forward EEG problem is available for a brain region of interest. Here, we investigate the reciprocity principle as a guideline for finding optimal current injection patterns in TES that comply with safety constraints. We define four different trial cortical targets in a detailed seven-tissue finite element head model, and analyze the performance of the reciprocity family of TES methods in terms of electrode density, targeting error, focality, intensity, and directionality using the LS and LCMV solutions as the reference standards. It is found that the reciprocity algorithms show good performance comparable to the LCMV and LS solutions. Comparing the 128 and 256 electrode cases, we found that the use of greater electrode density improves focality, directionality, and intensity parameters. The results show that the reciprocity principle can be used to quickly determine optimal current injection patterns in TES and help to simplify TES protocols that are consistent with hardware and software availability and with safety constraints.
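
    As a concrete, heavily simplified illustration of the reciprocity guideline (not the authors' implementation; the electrode count, lead-field values and current limit below are made up), the injection pattern can be read directly off the forward-EEG solution for a dipole at the target: inject through the electrode pair whose reciprocal potentials are most positive and most negative, scaled to the safety limit.

        import numpy as np

        # Hypothetical reciprocal (forward-EEG) potentials at 8 scalp electrodes for a unit dipole
        # placed at the cortical target and oriented along the desired stimulation direction.
        rng = np.random.default_rng(1)
        v_reciprocal = rng.normal(size=8)   # placeholder values; a real head model supplies these

        i_max = 2.0e-3                      # assumed total injected current limit (A)

        # Reciprocity heuristic: source at the electrode with the largest reciprocal potential,
        # sink at the electrode with the smallest; by reciprocity this favors current density
        # along the target direction.
        src = int(np.argmax(v_reciprocal))
        snk = int(np.argmin(v_reciprocal))
        injection = np.zeros_like(v_reciprocal)
        injection[src], injection[snk] = +i_max, -i_max
        print(f"inject +{i_max*1e3:.1f} mA at electrode {src} and -{i_max*1e3:.1f} mA at electrode {snk}")

    A least-squares alternative would instead solve min over I of ||L I - t||^2 across all electrodes under the same zero-sum and safety constraints, which is the LS reference standard the paper compares against.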

  4. Principle of least decoherence for Newtonian semiclassical gravity

    NASA Astrophysics Data System (ADS)

    Tilloy, Antoine; Diósi, Lajos

    2017-11-01

    Recent works have proved that semiclassical theories of gravity need not be fundamentally inconsistent, at least in the Newtonian regime. Using the machinery of continuous measurement theory and feedback, it was shown that one could construct well-behaved models of hybrid quantum-classical dynamics at the price of an imposed (nonunique) decoherence structure. We introduce a principle of least decoherence (PLD) which allows us to naturally single out a unique model from all the available options, up to some unspecified short distance regularization scale. Interestingly, the resulting model is found to coincide with the old—erstwhile only heuristically motivated—proposal of Penrose and one of us for gravity-related spontaneous decoherence and collapse. Finally, this paper suggests that it is in the submillimeter behavior of gravity that new phenomena might be found.

  5. Numerical analysis of fundamental mode selection of a He-Ne laser by a circular aperture

    NASA Astrophysics Data System (ADS)

    He, Xin; Zhang, Bin

    2011-11-01

    In the He-Ne laser with an integrated cavity made of Zerodur, the quality of the inner surface of the gain tube is limited by the machining techniques, which tends to influence the beam propagation and the transverse mode distribution. In order to improve the beam quality and select the fundamental mode, an aperture is usually introduced into the cavity. In the process of laser design, the Fresnel-Kirchhoff diffraction integral equation is adopted to calculate the optical field distributions on each interface. The transit matrix is obtained based on the self-reproducing principle and the finite element method. Thus, the optical field distribution on any interface and the field loss of each transverse mode can be obtained by solving the eigenvalues and eigenvectors of the transit matrix. For different-sized apertures in different positions, we obtain different matrices and corresponding calculation results. By comparing these results, the optimal size and position of the aperture can be determined. As a result, the feasibility of selecting the fundamental mode in a Zerodur He-Ne laser by a circular aperture has been verified theoretically.
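
    The transit-matrix eigenvalue analysis described here is closely related to the classic Fox-Li round-trip calculation. The sketch below is a minimal one-dimensional analogue with invented parameters (wavelength aside); it discretizes the Fresnel propagation kernel, applies a hard aperture at one mirror, and reads each transverse mode's diffraction loss off the eigenvalues of the round-trip matrix. Sign conventions in the kernel vary and do not affect the eigenvalue magnitudes.

        import numpy as np

        wavelength = 632.8e-9     # He-Ne wavelength (m)
        L_cav = 0.15              # mirror spacing (m), assumed
        half_width = 1.0e-3       # half-width of the computational window (m), assumed
        a_aperture = 0.35e-3      # aperture half-width (m), assumed
        N = 400                   # grid points

        x = np.linspace(-half_width, half_width, N)
        dx = x[1] - x[0]
        k = 2.0 * np.pi / wavelength

        # Discretized paraxial (Fresnel) propagation kernel between the mirrors.
        X1, X2 = np.meshgrid(x, x, indexing="ij")
        kernel = np.sqrt(1j / (wavelength * L_cav)) * np.exp(-1j * k * (X1 - X2) ** 2 / (2.0 * L_cav)) * dx

        # Hard aperture applied once per transit (placed at one mirror).
        mask = np.diag((np.abs(x) <= a_aperture).astype(float))

        # One round trip: propagate, clip at the aperture, propagate back, clip again.
        round_trip = mask @ kernel @ mask @ kernel

        # |gamma_n|^2 is the power a mode keeps per round trip, so its diffraction loss is 1 - |gamma_n|^2.
        gammas = np.linalg.eigvals(round_trip)
        losses = np.sort(1.0 - np.abs(gammas) ** 2)
        print("fundamental-mode loss per round trip:", losses[0])
        print("next-mode loss per round trip:", losses[1])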

  6. Analytical minimization of synchronicity errors in stochastic identification

    NASA Astrophysics Data System (ADS)

    Bernal, D.

    2018-01-01

    An approach to minimize error due to synchronicity faults in stochastic system identification is presented. The scheme is based on shifting the time domain signals so the phases of the fundamental eigenvector estimated from the spectral density are zero. A threshold on the mean of the amplitude-weighted absolute value of these phases, above which signal shifting is deemed justified, is derived and found to be proportional to the first mode damping ratio. It is shown that synchronicity faults do not map precisely to phasor multiplications in subspace identification and that the accuracy of spectral density estimated eigenvectors, for inputs with arbitrary spectral density, decrease with increasing mode number. Selection of a corrective strategy based on signal alignment, instead of eigenvector adjustment using phasors, is shown to be the product of the foregoing observations. Simulations that include noise and non-classical damping suggest that the scheme can provide sufficient accuracy to be of practical value.
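
    A minimal sketch of the alignment step described above, under simplifying assumptions (a single dominant spectral line, no segment averaging when estimating the spectral density, and invented variable names); depending on the FFT sign convention of a given dataset the applied shift may need its sign flipped.

        import numpy as np

        def align_channels(Y, fs):
            """Y: (n_channels, n_samples) array of nominally synchronous records; fs: sampling rate (Hz)."""
            n_ch, n_s = Y.shape
            Yf = np.fft.rfft(Y, axis=1)
            freqs = np.fft.rfftfreq(n_s, d=1.0 / fs)

            # Dominant spectral line of the averaged auto-spectra, used as a proxy for the first mode.
            power = np.mean(np.abs(Yf) ** 2, axis=0)
            power[0] = 0.0                              # ignore the DC bin
            i0 = int(np.argmax(power))
            f0 = freqs[i0]

            # Spectral density matrix at f0 (single snapshot here) and its leading eigenvector.
            S = np.outer(Yf[:, i0], np.conj(Yf[:, i0]))
            w, V = np.linalg.eigh(S)
            phi = V[:, np.argmax(w)]

            phases = np.angle(phi * np.conj(phi[0]))    # eigenvector phases relative to channel 0
            amps = np.abs(phi)
            fault_metric = np.sum(amps * np.abs(phases)) / np.sum(amps)   # compare to a damping-based threshold

            # Shift each channel so the eigenvector phases go to zero (circular shift in samples).
            shifts = -phases / (2.0 * np.pi * f0) * fs
            Y_aligned = np.vstack([np.roll(y, int(round(s))) for y, s in zip(Y, shifts)])
            return Y_aligned, f0, fault_metric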

  7. Climate-smart conservation: putting adaptation principles into practice

    USGS Publications Warehouse

    Stein, Bruce A.; Glick, Patty; Edelson, Naomi; Staudt, Amanda

    2014-01-01

    Climate change already is having significant impacts on the nation’s species and ecosystems, and these effects are projected to increase considerably over time. As a result, climate change is now a primary lens through which conservation and natural resource management must be viewed. How should we prepare for and respond to the impacts of climate change on wildlife and their habitats? What should we be doing differently in light of these climatic shifts, and what actions continue to make sense? Climate-Smart Conservation: Putting Adaptation Principles into Practice offers guidance for designing and carrying out conservation in the face of a rapidly changing climate. Addressing the growing threats brought about or accentuated by rapid climate change requires a fundamental shift in the practice of natural resource management and conservation. Traditionally, conservationists have focused their efforts on protecting and managing systems to maintain their current state, or to restore degraded systems back to a historical state regarded as more desirable. Conservation planners and practitioners will need to adopt forward-looking goals and implement strategies specifically designed to prepare for and adjust to current and future climatic changes, and the associated impacts on natural systems and human communities—an emerging discipline known as climate change adaptation. The field of climate change adaptation is still in its infancy. Although there is increasing attention focused on the subject, much of the guidance developed to date has been general in nature, concentrating on high-level principles rather than specific actions. It is against this backdrop that this guide was prepared as a means for helping put adaptation principles into practice, and for moving adaptation from planning to action.

  8. Does Minimally Invasive Spine Surgery Minimize Surgical Site Infections?

    PubMed

    Kulkarni, Arvind Gopalrao; Patel, Ravish Shammi; Dutta, Shumayou

    2016-12-01

    Retrospective review of prospectively collected data. To evaluate the incidence of surgical site infections (SSIs) in minimally invasive spine surgery (MISS) in a cohort of patients and compare with available historical data on SSI in open spinal surgery cohorts, and to evaluate additional direct costs incurred due to SSI. SSI can lead to prolonged antibiotic therapy, extended hospitalization, repeated operations, and implant removal. Small incisions and minimal dissection intrinsic to MISS may minimize the risk of postoperative infections. However, there is a dearth of literature on infections after MISS and their additional direct financial implications. All patients from January 2007 to January 2015 undergoing posterior spinal surgery with tubular retractor system and microscope in our institution were included. The procedures performed included tubular discectomies, tubular decompressions for spinal stenosis and minimal invasive transforaminal lumbar interbody fusion (TLIF). The incidence of postoperative SSI was calculated and compared to the range of cited SSI rates from published studies. Direct costs were calculated from medical billing for index cases and for patients with SSI. A total of 1,043 patients underwent 763 noninstrumented surgeries (discectomies, decompressions) and 280 instrumented (TLIF) procedures. The mean age was 52.2 years with male:female ratio of 1.08:1. Three infections were encountered with fusion surgeries (mean detection time, 7 days). All three required wound wash and debridement with one patient requiring unilateral implant removal. Additional direct cost due to infection was $2,678 per 100 MISS-TLIF. SSI increased hospital expenditure per patient 1.5-fold after instrumented MISS. Overall infection rate after MISS was 0.29%, with SSI rate of 0% in non-instrumented MISS and 1.07% with instrumented MISS. MISS can markedly reduce the SSI rate and can be an effective tool to minimize hospital costs.

  9. Principlism and communitarianism

    PubMed Central

    Callahan, D

    2003-01-01

    The decline in the interest in ethical theory is first outlined, as a background to the author's discussion of principlism. The author's own stance, that of a communitarian philosopher, is then described, before the subject of principlism itself is addressed. Two problems stand in the way of the author's embracing principlism: its individualistic bias and its capacity to block substantive ethical inquiry. The more serious problem the author finds to be its blocking function. Discussing the four scenarios the author finds that the utility of principlism is shown in the two scenarios about Jehovah's Witnesses but that when it comes to selling kidneys for transplantation and germline enhancement, principlism is of little help. PMID:14519838

  10. Principlism and communitarianism.

    PubMed

    Callahan, D

    2003-10-01

    The decline in the interest in ethical theory is first outlined, as a background to the author's discussion of principlism. The author's own stance, that of a communitarian philosopher, is then described, before the subject of principlism itself is addressed. Two problems stand in the way of the author's embracing principlism: its individualistic bias and its capacity to block substantive ethical inquiry. The more serious problem the author finds to be its blocking function. Discussing the four scenarios the author finds that the utility of principlism is shown in the two scenarios about Jehovah's Witnesses but that when it comes to selling kidneys for transplantation and germline enhancement, principlism is of little help.

  11. Phonon impedance matching: minimizing interfacial thermal resistance of thin films

    NASA Astrophysics Data System (ADS)

    Polanco, Carlos; Zhang, Jingjie; Ghosh, Avik

    2014-03-01

    The challenge in minimizing interfacial thermal resistance is to allow a broad-band spectrum of phonons, with non-linear dispersion and well defined translational and rotational symmetries, to cross the interface. We explain how to minimize this resistance using a frequency dependent broadening matrix that generalizes the notion of acoustic impedance to the whole phonon spectrum including symmetries. We show how to "match" two given materials by joining them with a single atomic layer, with a multilayer material and with a graded superlattice. Atomic layer "matching" requires a layer with a mass close to the arithmetic mean (or a spring constant close to the harmonic mean) to favor high frequency phonon transmission. For multilayer "matching," we want a material with a broadening close to the geometric mean to maximize transmission peaks. For graded superlattices, a continuous sequence of geometric means translates to an exponentially varying broadening that generates a wide-band antireflection coating for both the coherent and incoherent limits. Our results are supported by "first principles" calculations of thermal conductance for GaAs/Ga(x)Al(1-x)As/AlAs thin films using the Non-Equilibrium Green's Function formalism coupled with Density Functional Perturbation Theory. NSF-CAREER (QMHP 1028883), NSF-IDR (CBET 1134311), XSEDE.
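
    As a purely numerical illustration of the quoted matching rules (all parameter values below are invented, not taken from the abstract), the mass of a single matching layer is taken near the arithmetic mean of the contact masses, its spring constant near the harmonic mean, and a multilayer's broadening near the geometric mean of the two contact broadenings:

        import numpy as np

        # Illustrative contact parameters (made up): atomic masses in amu, spring constants in N/m,
        # and a frequency-independent stand-in for the contact broadenings (arbitrary units).
        m1, m2 = 69.7, 27.0
        k1, k2 = 90.0, 110.0
        b1, b2 = 4.2, 7.9

        m_layer = 0.5 * (m1 + m2)              # single atomic layer: mass near the arithmetic mean
        k_layer = 2.0 / (1.0 / k1 + 1.0 / k2)  # ... or spring constant near the harmonic mean
        b_mid = np.sqrt(b1 * b2)               # multilayer: broadening near the geometric mean

        print(f"layer mass ~ {m_layer:.1f} amu, layer spring ~ {k_layer:.1f} N/m, broadening ~ {b_mid:.2f}")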

  12. The science of research: the principles underlying the discovery of cognitive and other biological mechanisms.

    PubMed

    Silva, Alcino J

    2007-01-01

    Studies of cognitive function include a wide spectrum of disciplines, with very diverse theoretical and practical frameworks. For example, in Behavioral Neuroscience cognitive mechanisms are mostly inferred from loss of function (lesion) experiments while in Cognitive Neuroscience these mechanisms are commonly deduced from brain activation patterns. Although neuroscientists acknowledge the limitations of deriving conclusions using a limited scope of approaches, there are no systematically studied, objective and explicit criteria for what is required to test a given hypothesis of cognitive function. This problem plagues every discipline in science: scientific research lacks objective, systematic studies that validate the principles underlying even its most elemental practices. For example, scientists decide what experiments are best suited to test key ideas in their field, which hypotheses have sufficient supporting evidence and which require further investigation, which studies are important and which are not, based on intuitions derived from experience, implicit principles learned from mentors and colleagues, traditions in their fields, etc. Philosophers have made numerous attempts to articulate and frame the principles that guide research and innovation, but these speculative ideas have remained untested and have had a minimal impact on the work of scientists. Here, I propose the development of methods for systematically and objectively studying and improving the modus operandi of research and development. This effort (the science of scientific research or S2) will benefit all aspects of science, from education of young scientists to research, publishing and funding, since it will provide explicit and systematically tested frameworks for practices in science. To illustrate its goals, I will introduce a hypothesis (the Convergent Four) derived from experimental practices common in molecular and cellular biology. This S2 hypothesis proposes that there are at least

  13. Translating cryobiology principles into trans-disciplinary storage guidelines for biorepositories and biobanks: a concept paper.

    PubMed

    Benson, E; Betson, F; Fuller, B J; Harding, K; Kofanova, O

    2013-01-01

    Low temperatures are used routinely to preserve diverse biospecimens, genetic resources and non-viable or viable biosamples for medical and clinical research in hospital-based biobanks and non-medical biorepositories, such as genebanks and culture, scientific, museum, and environmental collections. However, the basic knowledge underpinning preservation can sometimes be overlooked by practitioners who are unfamiliar with fundamental cryobiological principles which are more usually described in research literature rather than in quality and risk management documents. Whilst procedures vary, low temperature storage is a common requirement and reaching consensus as to how best it is applied could facilitate the entire biopreservation sector. This may be achieved by encouraging an understanding of cryoprotection theory and emphasizing the criticality of thermal events (glass transitions, ice nucleation, thawing) for sample integrity, functionality and stability. The objective of this paper is to inspire diverse biopreservation sectors to communicate more clearly about low temperature storage and, raise awareness of the importance of cryobiology principles to field newcomers and biopreservation practitioners, by considering how the principles may be translated into evidence-based guidelines for biobank and biorepository operations.

  14. Equivalence principles and electromagnetism

    NASA Technical Reports Server (NTRS)

    Ni, W.-T.

    1977-01-01

    The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.

  15. A Variational Principle for Reconstruction of Elastic Deformations in Shear Deformable Plates and Shells

    NASA Technical Reports Server (NTRS)

    Tessler, Alexander; Spangler, Jan L.

    2003-01-01

    A variational principle is formulated for the inverse problem of full-field reconstruction of three-dimensional plate/shell deformations from experimentally measured surface strains. The formulation is based upon the minimization of a least squares functional that uses the complete set of strain measures consistent with linear, first-order shear-deformation theory. The formulation, which accounts for transverse shear deformation, is applicable to the analysis of thin and moderately thick plate and shell structures. The main benefit of the variational principle is that it is well suited for C^0-continuous displacement finite element discretizations, thus enabling the development of robust algorithms for application to complex civil and aeronautical structures. The methodology is especially aimed at the next generation of aerospace vehicles for use in real-time structural health monitoring systems.
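
    Schematically, and in our notation rather than the paper's, such a least-squares functional penalizes the mismatch between the strain measures computed from the unknown displacement field u and those inferred from surface measurements at n sensor stations, e.g. membrane (e), bending (k) and transverse-shear (g) measures, with a weighting lambda on the shear term assumed here:

        \Phi(\mathbf{u}) = \sum_{i=1}^{n} \Big( \big\| \mathbf{e}(\mathbf{u})(x_i) - \mathbf{e}^{\varepsilon}_i \big\|^2 + \big\| \mathbf{k}(\mathbf{u})(x_i) - \mathbf{k}^{\varepsilon}_i \big\|^2 + \lambda \big\| \mathbf{g}(\mathbf{u})(x_i) - \mathbf{g}^{\varepsilon}_i \big\|^2 \Big), \qquad \mathbf{u}^{\ast} = \arg\min_{\mathbf{u}} \Phi(\mathbf{u}).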

  16. Disasters and mass casualties: I. General principles of response and management.

    PubMed

    Born, Christopher T; Briggs, Susan M; Ciraulo, David L; Frykberg, Eric R; Hammond, Jeffrey S; Hirshberg, Asher; Lhowe, David W; O'Neill, Patricia A

    2007-07-01

    Disaster planning and response to a mass casualty incident pose unique demands on the medical community. Because they would be required to confront many casualties with bodily injury and surgical problems, surgeons in particular must become better educated in disaster management. Compared with routine practice, triage principles in disasters require an entirely different approach to evaluation and care and often run counter to training and ethical values. An effective response to disaster and mass casualty events should focus on an "all hazards" approach, defined as the ability to adapt and apply fundamental disaster management principles universally to any mass casualty incident, whether caused by people or nature. Organizational tools such as the Incident Command System and the Hospital Incident Command System help to effect a rapid and coordinated response to specific situations. The United States federal government, through the National Response Plan, has the responsibility to respond quickly and efficiently to catastrophic incidents and to ensure critical life-saving assistance. International medical surgical response teams are capable of providing medical, surgical, and intensive care services in austere environments anywhere in the world.

  17. A New Principle in Physics: the Principle "Finiteness", and Some Consequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abraham Sternlieb

    2010-06-25

    In this paper I propose a new principle in physics: the principle of "finiteness". It stems from the definition of physics as a science that deals (among other things) with measurable dimensional physical quantities. Since measurement results, including their errors, are always finite, the principle of finiteness postulates that the mathematical formulation of "legitimate" laws of physics should prevent exactly zero or infinite solutions. Some consequences of the principle of finiteness are discussed, in general, and then more specifically in the fields of special relativity, quantum mechanics, and quantum gravity. The consequences are derived independently of any other theory or principle in physics. I propose "finiteness" as a postulate (like the constancy of the speed of light in vacuum, "c"), as opposed to a notion whose validity has to be corroborated by, or derived theoretically or experimentally from other facts, theories, or principles.

  18. The evolution of cell types in animals: emerging principles from molecular studies.

    PubMed

    Arendt, Detlev

    2008-11-01

    Cell types are fundamental units of multicellular life but their evolution is obscure. How did the first cell types emerge and become distinct in animal evolution? What were the sets of cell types that existed at important evolutionary nodes that represent eumetazoan or bilaterian ancestors? How did these ancient cell types diversify further during the evolution of organ systems in the descending evolutionary lines? The recent advent of cell type molecular fingerprinting has yielded initial insights into the evolutionary interrelationships of cell types between remote animal phyla and has allowed us to define some first principles of cell type diversification in animal evolution.

  19. Generalized uncertainty principle and quantum gravity phenomenology

    NASA Astrophysics Data System (ADS)

    Bosso, Pasquale

    The fundamental physical description of Nature is based on two mutually incompatible theories: Quantum Mechanics and General Relativity. Their unification in a theory of Quantum Gravity (QG) remains one of the main challenges of theoretical physics. Quantum Gravity Phenomenology (QGP) studies QG effects in low-energy systems. The basis of one such phenomenological model is the Generalized Uncertainty Principle (GUP), which is a modified Heisenberg uncertainty relation and predicts a deformed canonical commutator. In this thesis, we compute Planck-scale corrections to angular momentum eigenvalues, the hydrogen atom spectrum, the Stern-Gerlach experiment, and the Clebsch-Gordan coefficients. We then rigorously analyze the GUP-perturbed harmonic oscillator and study new coherent and squeezed states. Furthermore, we introduce a scheme for increasing the sensitivity of optomechanical experiments for testing QG effects. Finally, we suggest future projects that may potentially test QG effects in the laboratory.
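
    For reference, the deformed canonical commutator that such GUP models typically predict can be written (in a commonly used one-dimensional quadratic form; some formulations also include a term linear in p) as

        [\hat{x}, \hat{p}] = i\hbar \left( 1 + \beta \hat{p}^{\,2} \right), \qquad \beta = \beta_0 \, \ell_{\mathrm{Pl}}^{2}/\hbar^{2},

    with β₀ dimensionless; setting β → 0 recovers the ordinary Heisenberg algebra.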

  20. Evolutionary dynamics from a variational principle.

    PubMed

    Klimek, Peter; Thurner, Stefan; Hanel, Rudolf

    2010-07-01

    We demonstrate with a thought experiment that fitness-based population dynamical approaches to evolution are not able to make quantitative, falsifiable predictions about the long-term behavior of some evolutionary systems. A key characteristic of evolutionary systems is the ongoing endogenous production of new species. These novel entities change the conditions for already existing species. Even Darwin's Demon, a hypothetical entity with exact knowledge of the abundance of all species and their fitness functions at a given time, could not prestate the impact of these novelties on established populations. We argue that fitness is always a posteriori knowledge--it measures but does not explain why a species has reproductive success or not. To overcome these conceptual limitations, a variational principle is proposed in a spin-model-like setup of evolutionary systems. We derive a functional which is minimized under the most general evolutionary formulation of a dynamical system, i.e., evolutionary trajectories causally emerge as a minimization of a functional. This functional allows the derivation of analytic solutions of the asymptotic diversity for stochastic evolutionary systems within a mean-field approximation. We test these approximations by numerical simulations of the corresponding model and find good agreement in the position of phase transitions in diversity curves. The model is further able to reproduce stylized facts of timeseries from several man-made and natural evolutionary systems. Light will be thrown on how species and their fitness landscapes dynamically coevolve.

  1. Principled negotiation and distributed optimization for advanced air traffic management

    NASA Astrophysics Data System (ADS)

    Wangermann, John Paul

    Today's aircraft/airspace system faces complex challenges. Congestion and delays are widespread as air traffic continues to grow. Airlines want to better optimize their operations, and general aviation wants easier access to the system. Additionally, the accident rate must decline just to keep the number of accidents each year constant. New technology provides an opportunity to rethink the air traffic management process. Faster computers, new sensors, and high-bandwidth communications can be used to create new operating models. The choice is no longer between "inflexible" strategic separation assurance and "flexible" tactical conflict resolution. With suitable operating procedures, it is possible to have strategic, four-dimensional separation assurance that is flexible and allows system users maximum freedom to optimize operations. This thesis describes an operating model based on principled negotiation between agents. Many multi-agent systems have agents that have different, competing interests but have a shared interest in coordinating their actions. Principled negotiation is a method of finding agreement between agents with different interests. By focusing on fundamental interests and searching for options for mutual gain, agents with different interests reach agreements that provide benefits for both sides. Using principled negotiation, distributed optimization by each agent can be coordinated leading to iterative optimization of the system. Principled negotiation is well-suited to aircraft/airspace systems. It allows aircraft and operators to propose changes to air traffic control. Air traffic managers check the proposal maintains required aircraft separation. If it does, the proposal is either accepted or passed to agents whose trajectories change as part of the proposal for approval. Aircraft and operators can use all the data at hand to develop proposals that optimize their operations, while traffic managers can focus on their primary duty of ensuring
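
    A minimal code sketch of the propose-check-approve cycle described above; the data structures and function names (Proposal, separation_ok, agent_accepts) are illustrative assumptions, not taken from the thesis.

      # Illustrative sketch of one round of principled negotiation: an agent
      # proposes trajectory changes, the traffic manager checks separation, and
      # affected agents approve or reject. All names are hypothetical.
      from dataclasses import dataclass

      @dataclass
      class Proposal:
          proposer: str            # aircraft or operator submitting the request
          changes: dict            # agent id -> proposed 4D trajectory

      def separation_ok(trajectories):
          """Stand-in for the traffic manager's 4D separation check."""
          return True              # a real check tests pairwise separation minima

      def agent_accepts(agent, new_trajectory):
          """Stand-in for an affected agent evaluating its own interests."""
          return True

      def negotiate(proposal, current):
          candidate = {**current, **proposal.changes}
          if not separation_ok(candidate.values()):
              return "rejected: separation violated"
          affected = set(proposal.changes) - {proposal.proposer}
          for agent in affected:   # pass the proposal to every agent it affects
              if not agent_accepts(agent, candidate[agent]):
                  return f"rejected by {agent}"
          return "accepted"

      print(negotiate(Proposal("AC123", {"AC123": "new 4D path", "AC456": "shifted path"}),
                      {"AC123": "old path", "AC456": "old path"}))   # -> accepted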

  2. Do the Modified Uncertainty Principle and Polymer Quantization predict same physics?

    NASA Astrophysics Data System (ADS)

    Majumder, Barun; Sen, Sourav

    2012-10-01

    In this Letter we study the effects of the Modified Uncertainty Principle proposed in Ali et al. (2009) [5] on simple quantum mechanical systems and examine their thermodynamic properties. We have assumed that the quantum particles follow Maxwell-Boltzmann statistics with no spin. We compare our results with the results found in the GUP and polymer quantum mechanical frameworks. Interestingly, we find that the corrected thermodynamic quantities are exactly the same as the polymer results, although the length scale involved has a theoretically different origin. Hence, further study is needed to investigate whether these two approaches are conceptually connected at a fundamental level.

  3. First-Principles Prediction of Thermodynamically Stable Two-Dimensional Electrides

    DOE PAGES

    Ming, Wenmei; Yoon, Mina; Univ. of Tennessee, Knoxville, TN; ...

    2016-10-21

    Two-dimensional (2D) electrides, emerging as a new type of layered material whose electrons are confined in interlayer spaces instead of at atomic proximities, are receiving interest for their high performance in various (opto)electronics and catalytic applications. Experimentally, however, 2D electrides have been only found in a couple of layered nitrides and carbides. We report new thermodynamically stable alkaline-earth based 2D electrides by using a first-principles global structure optimization method, phonon spectrum analysis, and molecular dynamics simulation. The method was applied to binary compounds consisting of alkaline-earth elements as cations and group VA, VIA, or VIIA nonmetal elements as anions. We also revealed that the stability of a layered 2D electride structure is closely related to the cation/anion size ratio; stable 2D electrides possess a sufficiently large cation/anion size ratio to minimize electrostatic energy among cations, anions, and anionic electrons. This work demonstrates a new avenue to the discovery of thermodynamically stable 2D electrides beyond experimental material databases and provides new insight into the principles of electride design.

  4. Fundamental characteristics of a dual-colour fibre optic SPR sensor

    NASA Astrophysics Data System (ADS)

    Suzuki, Hitoshi; Sugimoto, Mitsunori; Matsui, Yoshikazu; Kondoh, Jun

    2006-06-01

    In this paper, we present the fundamental characteristics of a novel dual-colour optical fibre surface plasmon resonance (SPR) sensor for a portable low-cost sensing system. The principle of the proposed SPR sensor is based on the differential reflectance method. Light from two light-emitting diodes (LEDs) of different wavelengths, flashed alternately, is fed to the sensor via two optical couplers. The reflected light is detected by a photodiode. Changes in reflectance at the two wavelengths are proportional to the refractive-index change of the medium of interest, and taking the difference between the two reflectances improves the sensitivity almost twofold. Measuring ethanol solutions with different refractive indices reveals that the sensor has a linear response to refractive-index changes from 1.333 to 1.3616. By measuring the stability of the time response, we estimate that the limit of detection (LOD) of the refractive index is 5.2 × 10^-4.
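
    A small numerical sketch of the differential-reflectance readout described above; the calibration constant is an assumed placeholder, not a value from the paper.

      # Sketch of the dual-colour differential readout: reflectance changes at
      # the two LED wavelengths are subtracted (nearly doubling the response)
      # and mapped to a refractive-index change by a linear calibration.
      # The sensitivity value is an assumption for illustration only.

      def delta_n(dR_wl1, dR_wl2, sensitivity=50.0):
          """Estimated refractive-index change from the two reflectance changes.

          sensitivity : differential reflectance per refractive-index unit (assumed)
          """
          return (dR_wl1 - dR_wl2) / sensitivity

      # Reflectance rises at one wavelength and falls at the other:
      print(delta_n(0.010, -0.008))   # ~3.6e-4 RIU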

  5. Surgical treatment of osteoporotic fractures: An update on the principles of management.

    PubMed

    Yaacobi, Eyal; Sanchez, Daniela; Maniar, Hemil; Horwitz, Daniel S

    2017-12-01

    The treatment of osteoporotic fractures continues to challenge orthopedic surgeons. The fragility of the underlying bone, in conjunction with the need for specific implants, has led to the development of explicit surgical techniques intended to minimize implant-failure-related complications, morbidity, and mortality. From the patient's perspective, frailty, dementia, and other medical co-morbidities create a complex situation necessitating high vigilance during the perioperative and post-operative period. This update reviews current principles and techniques essential to successful surgical treatment of these injuries. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Principles that underpin effective school-based drug education.

    PubMed

    Midford, Richard; Munro, Geoffrey; McBride, Nyanda; Snow, Pamela; Ladzinski, Ursula

    2002-01-01

    This study identifies the conceptual underpinnings of effective school-based drug education practice in light of contemporary research evidence and the practical experience of a broad range of drug education stakeholders. The research involved a review of the literature, a national survey of 210 Australian teachers and others involved in drug education, and structured interviews with 22 key Australian drug education policy stakeholders. The findings from this research have been distilled and presented as a list of 16 principles that underpin effective drug education. In broad terms, drug education should be evidence-based, developmentally appropriate, sequential, and contextual. Programs should be initiated before drug use commences. Strategies should be linked to goals and should incorporate harm minimization. Teaching should be interactive and use peer leaders. The role of the classroom teacher is central. Certain program content is important, as is social and resistance skills training. Community values, the social context of use, and the nature of drug harm have to be addressed. Coverage needs to be adequate and supported by follow-up. It is envisaged that these principles will provide all those involved in the drug education field with a set of up-to-date, research-based guidelines against which to reference decisions on program design, selection, implementation, and evaluation.

  7. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1970-01-01

    This is the first of a new series of brief ancedotes about materials and phenomena which exemplify chemical principles. Examples include (1) the sea-lab experiment illustrating principles of the kinetic theory of gases, (2) snow-making machines illustrating principles of thermodynamics in gas expansions and phase changes, and (3) sunglasses that…

  8. Hexavalent Chromium Minimization Strategy

    DTIC Science & Technology

    2011-05-01

    DoD logistics initiative on hexavalent chromium (Cr(VI), a recognized cancer hazard) minimization, including non-chrome primer alternatives. Office of the Secretary of Defense, Hexavalent Chromium Minimization Strategy report (Report Documentation Page, Form Approved OMB No. 0704-0188; May 2011).

  9. Exact symmetries in the velocity fluctuations of a hot Brownian swimmer

    NASA Astrophysics Data System (ADS)

    Falasco, Gianmaria; Pfaller, Richard; Bregulla, Andreas P.; Cichos, Frank; Kroy, Klaus

    2016-09-01

    Symmetries constrain dynamics. We test this fundamental physical principle, experimentally and by molecular dynamics simulations, for a hot Janus swimmer operating far from thermal equilibrium. Our results establish scalar and vectorial steady-state fluctuation theorems and a thermodynamic uncertainty relation that link the fluctuating particle current to its entropy production at an effective temperature. A Markovian minimal model elucidates the underlying nonequilibrium physics.
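
    For reference, the generic textbook forms of a steady-state fluctuation theorem and a thermodynamic uncertainty relation, with the bath temperature replaced by the swimmer's effective temperature, read as follows (these are the standard forms, not necessarily the exact relations derived in the paper):

      \frac{P(\Sigma_t = s)}{P(\Sigma_t = -s)} = e^{s/k_B}, \qquad \frac{\mathrm{Var}(J_t)}{\langle J_t \rangle^{2}} \;\ge\; \frac{2 k_B}{\langle \Sigma_t \rangle},

    where \Sigma_t is the entropy produced up to time t (evaluated at the effective temperature) and J_t is the integrated particle current.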

  10. Ethics fundamentals.

    PubMed

    Chambers, David W

    2011-01-01

    Ethics is about studying the right and the good; morality is about acting as one should. Although there are differences among what is legal, charitable, professional, ethical, and moral, these desirable characteristics tend to cluster and are treasured in dentistry. The traditional approach to professionalism in dentistry is based on a theory of biomedical ethics advanced 30 years ago and known as the principles approach: general ideals such as respect for autonomy, nonmaleficence, beneficence, justice, and veracity are offered as guides. Growth in professionalism consists in learning to interpret the application of these principles as one's peers do. Moral behavior is conceived as a continuous cycle of sensitivity to situations requiring moral response, moral reasoning, the moral courage to take action when necessary, and integration of habits of moral behavior into one's character. This essay is the first of two papers that provide the backbone for the IDEA Project of the College--an online, multiformat, interactive "textbook" of ethics for the profession.

  11. 32 CFR 2001.16 - Fundamental classification guidance review.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Fundamental classification guidance review. 2001... INFORMATION Classification § 2001.16 Fundamental classification guidance review. (a) Performance of fundamental classification guidance reviews. An initial fundamental classification guidance review shall be...

  12. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  13. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  14. Children acquire the later-greater principle after the cardinal principle

    PubMed Central

    Le Corre, Mathieu

    2014-01-01

    Many have proposed that the acquisition of the cardinal principle is a result of the discovery of the numerical significance of the order of the number words in the count list. However, this need not be the case. Indeed, the cardinal principle does not state anything about the numerical significance of the order of the number words. It only states that the last word of a correct count denotes the numerosity of the counted set. Here we test whether the acquisition of the cardinal principle involves the discovery of the later-greater principle – i.e., that the order of the number words corresponds to the relative size of the numerosities they denote. Specifically, we tested knowledge of verbal numerical comparisons (e.g., Is “ten” more than “six”?) in children who had recently learned the cardinal principle. We find that these children can compare number words between “six” and “ten” only if they have mapped them onto non-verbal representations of numerosity. We suggest that this means that the acquisition of the cardinal principle does not involve the discovery of the correspondence between the order of the number words and the relative size of the numerosities they denote. PMID:24372336

  15. Fundamentals of Geophysics

    NASA Astrophysics Data System (ADS)

    Frohlich, Cliff

    Choosing an intermediate-level geophysics text is always problematic: What should we teach students after they have had introductory courses in geology, math, and physics, but little else? Fundamentals of Geophysics is aimed specifically at these intermediate-level students, and the author's stated approach is to construct a text “using abundant diagrams, a simplified mathematical treatment, and equations in which the student can follow each derivation step-by-step.” Moreover, for Lowrie, the Earth is round, not flat—the “fundamentals of geophysics” here are the essential properties of our Earth the planet, rather than useful techniques for finding oil and minerals. Thus this book is comparable in both level and approach to C. M. R. Fowler's The Solid Earth (Cambridge University Press, 1990).

  16. Proposed correlation of modern processing principles for Ayurvedic herbal drug manufacturing: A systematic review.

    PubMed

    Jain, Rahi; Venkatasubramanian, Padma

    2014-01-01

    Quality Ayurvedic herbal medicines are potential, low-cost solutions for addressing contemporary healthcare needs of both the Indian and the global community. Correlating Ayurvedic herbal preparations with modern processing principles (MPPs) can help develop new, and apply existing, technology for scaling up production of the medicines, which is necessary to meet the growing demand. Understanding the fundamental Ayurvedic principles behind formulation and processing is also important for improving the dosage forms. Even though the Ayurvedic industry has adopted technologies from the food, chemical, and pharmaceutical industries, there is no systematic study correlating the traditional and modern processing methods. This study is an attempt to provide a possible correlation between the Ayurvedic processing methods and MPPs. A systematic literature review was performed to identify the Ayurvedic processing methods by collecting information from English editions of classical Ayurveda texts on medicine preparation methods. Correlation between traditional methods and MPPs was done based on the techniques used in Ayurvedic drug processing. It was observed that in Ayurvedic medicine preparations there were two major types of processes, namely extraction and separation. Extraction uses membrane rupturing and solute diffusion principles, while separation uses volatility, adsorption, and size-exclusion principles. The study provides systematic documentation of methods used in Ayurveda for herbal drug preparation along with its interpretation in terms of MPPs. This is a first step towards improving or replacing traditional techniques. New or existing technologies can then be used to improve dosage forms and scale up production while maintaining the Ayurvedic principles embodied in the traditional techniques.

  17. Size principle and information theory.

    PubMed

    Senn, W; Wyler, K; Clamann, H P; Kleinle, J; Lüscher, H R; Müller, L

    1997-01-01

    The motor units of a skeletal muscle may be recruited according to different strategies. From all possible recruitment strategies nature selected the simplest one: in most actions of vertebrate skeletal muscles the recruitment of its motor units is by increasing size. This so-called size principle permits a high precision in muscle force generation since small muscle forces are produced exclusively by small motor units. Larger motor units are activated only if the total muscle force has already reached certain critical levels. We show that this recruitment by size is not only optimal in precision but also optimal in an information theoretical sense. We consider the motoneuron pool as an encoder generating a parallel binary code from a common input to that pool. The generated motoneuron code is sent down through the motoneuron axons to the muscle. We establish that an optimization of this motoneuron code with respect to its information content is equivalent to the recruitment of motor units by size. Moreover, maximal information content of the motoneuron code is equivalent to a minimal expected error in muscle force generation.
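
    A toy numerical illustration of the precision argument (our own sketch, not the authors' model): the achievable total forces are the cumulative sums of unit forces in recruitment order, so recruiting small units first keeps the quantization error small at low target forces.

      # Toy sketch: compare force quantization error for recruitment by
      # increasing size versus decreasing size. Unit forces are illustrative.

      unit_forces = [1, 1, 2, 4, 8, 16, 32]        # motor-unit twitch forces

      def levels(order):
          """Total forces achievable by recruiting units in the given order."""
          out, total = [0.0], 0.0
          for f in order:
              total += f
              out.append(total)
          return out

      def mean_error(order, targets):
          """Average distance from each target to the nearest achievable force."""
          lv = levels(order)
          return sum(min(abs(t - l) for l in lv) for t in targets) / len(targets)

      small_targets = [0.5 * k for k in range(1, 17)]      # weak contractions

      print(mean_error(sorted(unit_forces), small_targets))                  # ~0.7
      print(mean_error(sorted(unit_forces, reverse=True), small_targets))    # ~4.3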

  18. Fundamental device design considerations in the development of disruptive nanoelectronics.

    PubMed

    Singh, R; Poole, J O; Poole, K F; Vaidya, S D

    2002-01-01

    In the last quarter of a century, silicon-based integrated circuits (ICs) have played a major role in the growth of the economy throughout the world. A number of new technologies, such as quantum computing, molecular computing, DNA molecules for computing, etc., are currently being explored to create a product to replace semiconductor transistor technology. We have examined all of the currently explored options and found that none of them is suitable as a replacement for silicon ICs. In this paper we provide fundamental device criteria that must be satisfied for the successful operation of a manufacturable, not yet invented, device. The two fundamental limits are the removal of heat and reliability. The switching speed of any practical man-made computing device will be in the range of 10^-15 to 10^-3 s. Heisenberg's uncertainty principle and the computer architecture set the heat generation limit. The thermal conductivity of the materials used in the fabrication of a nanodimensional device sets the heat removal limit. In current electronic products, redundancy plays a significant part in improving the reliability of parts with macroscopic defects. In the future, microscopic and even nanoscopic defects will play a critical role in the reliability of disruptive nanoelectronics. The lattice vibrations will set the intrinsic reliability of future computing systems. The two critical limits discussed in this paper provide criteria for the selection of materials used in the fabrication of future devices. Our work shows that diamond contains the clue to providing computing devices that will surpass the performance of silicon-based nanoelectronics.
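
    A back-of-envelope sketch of the heat-generation argument above (our own numbers; the device density and activity factor are assumptions): the energy-time uncertainty relation bounds the energy of a switching event from below by roughly hbar/(2*dt), and multiplying by the switching rate and device density gives a lower bound on the dissipated power density.

      # Back-of-envelope Heisenberg-limited power density. Assumptions:
      # every device switches continuously and the device density is 1e10 /cm^2.
      hbar = 1.054571817e-34                     # J*s

      def min_power_density(switch_time_s, devices_per_cm2, activity=1.0):
          """Lower bound on dissipated power density in W/cm^2."""
          energy_per_switch = hbar / (2.0 * switch_time_s)   # J per event
          switch_rate = activity / switch_time_s             # events per second
          return energy_per_switch * switch_rate * devices_per_cm2

      # Femtosecond switching at the assumed density:
      print(min_power_density(1e-15, 1e10))      # ~5e5 W/cm^2, far beyond removable heat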

  19. Just-in-Time Adaptive Interventions (JITAIs) in Mobile Health: Key Components and Design Principles for Ongoing Health Behavior Support.

    PubMed

    Nahum-Shani, Inbal; Smith, Shawna N; Spring, Bonnie J; Collins, Linda M; Witkiewitz, Katie; Tewari, Ambuj; Murphy, Susan A

    2018-05-18

    The just-in-time adaptive intervention (JITAI) is an intervention design aiming to provide the right type/amount of support, at the right time, by adapting to an individual's changing internal and contextual state. The availability of increasingly powerful mobile and sensing technologies underpins the use of JITAIs to support health behavior, as in such a setting an individual's state can change rapidly, unexpectedly, and in his/her natural environment. Despite the increasing use and appeal of JITAIs, a major gap exists between the growing technological capabilities for delivering JITAIs and research on the development and evaluation of these interventions. Many JITAIs have been developed with minimal use of empirical evidence, theory, or accepted treatment guidelines. Here, we take an essential first step towards bridging this gap. Building on health behavior theories and the extant literature on JITAIs, we clarify the scientific motivation for JITAIs, define their fundamental components, and highlight design principles related to these components. Examples of JITAIs from various domains of health behavior research are used for illustration. As we enter a new era of technological capacity for delivering JITAIs, it is critical that researchers develop sophisticated and nuanced health behavior theories capable of guiding the construction of such interventions. Particular attention has to be given to better understanding the implications of providing timely and ecologically sound support for intervention adherence and retention.

  20. Understanding the decision-making environment for people in minimally conscious state.

    PubMed

    Yelden, Kudret; Sargent, Sarah; Samanta, Jo

    2017-04-11

    Patients in a minimally conscious state (MCS) show minimal, fluctuating, but definitive signs of awareness of themselves and their environments. They may exhibit behaviours ranging from the ability to track objects or people with their eyes to the making of simple choices, which requires the ability to recognise objects and follow simple commands. While patients with MCS have higher chances of further recovery than people in vegetative states, this is not guaranteed and their prognosis is fundamentally uncertain. Therefore, patients with MCS need regular input from healthcare professionals to monitor their progress (or non-progress) and to address their needs for rehabilitation and for the provision of an appropriate environment and equipment. These requirements form a backdrop to the potentially huge variety of ethical-legal dilemmas that may be faced by their families, caregivers and, ultimately, the courts. This paper analyses the decision-making environment for people with MCS using data obtained through four focus groups that included the input of 29 senior decision makers in the area. The results of the focus group study are presented and further explored with attention to recurrent and strong themes such as lack of expertise, resource issues, and the influence of families and friends of people with MCS.

  1. A Reassessment of George Pierce Baker's "The Principles of Argumentation": Minimizing the Use of Formal Logic in Favor of Practical Approaches

    ERIC Educational Resources Information Center

    Bordelon, Suzanne

    2006-01-01

    In this article, the author demonstrated how recent histories relied primarily on previous accounts and one textbook to characterize George Pierce Baker's work. This narrow assessment of "The Principles of Argumentation" limits one's understanding of his contribution to argumentation theory and pedagogy. Similarly, one has seen the need for care…

  2. Fundamentals of fluid sealing

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamentals of fluid sealing, including seal operating regimes, are discussed and the general fluid-flow equations for fluid sealing are developed. Seal performance parameters such as leakage and power loss are presented. Included in the discussion are the effects of geometry, surface deformations, rotation, and both laminar and turbulent flows. The concept of pressure balancing is presented, as are differences between liquid and gas sealing. Mechanisms of seal surface separation, fundamental friction and wear concepts applicable to seals, seal materials, and pressure-velocity (PV) criteria are discussed.
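
    As one concrete example of the kind of fluid-flow relation involved (a textbook laminar result, not necessarily the report's exact formulation), leakage through a thin rectangular sealing gap follows the parallel-plate Poiseuille law Q = h^3 w dp / (12 mu L); the cubic dependence on film thickness is why small changes in surface separation dominate seal leakage.

      # Laminar leakage through a thin rectangular sealing gap (textbook
      # parallel-plate result, used here only for illustration).

      def laminar_leakage(gap_m, width_m, length_m, dp_pa, viscosity_pa_s):
          """Volumetric leakage rate in m^3/s."""
          return gap_m**3 * width_m * dp_pa / (12.0 * viscosity_pa_s * length_m)

      # 5-micrometre film, 10 cm seal circumference, 2 mm land, 1 MPa, light oil:
      print(laminar_leakage(5e-6, 0.10, 2e-3, 1e6, 0.05))   # ~1e-8 m^3/s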

  3. Host Biology in Light of the Microbiome: Ten Principles of Holobionts and Hologenomes

    PubMed Central

    Bordenstein, Seth R.; Theis, Kevin R.

    2015-01-01

    Groundbreaking research on the universality and diversity of microorganisms is now challenging the life sciences to upgrade fundamental theories that once seemed untouchable. To fully appreciate the change that the field is now undergoing, one has to place the epochs and foundational principles of Darwin, Mendel, and the modern synthesis in light of the current advances that are enabling a new vision for the central importance of microbiology. Animals and plants are no longer heralded as autonomous entities but rather as biomolecular networks composed of the host plus its associated microbes, i.e., "holobionts." As such, their collective genomes forge a "hologenome," and models of animal and plant biology that do not account for these intergenomic associations are incomplete. Here, we integrate these concepts into historical and contemporary visions of biology and summarize a predictive and refutable framework for their evaluation. Specifically, we present ten principles that clarify and append what these concepts are and are not, explain how they both support and extend existing theory in the life sciences, and discuss their potential ramifications for the multifaceted approaches of zoology and botany. We anticipate that the conceptual and evidence-based foundation provided in this essay will serve as a roadmap for hypothesis-driven, experimentally validated research on holobionts and their hologenomes, thereby catalyzing the continued fusion of biology's subdisciplines. At a time when symbiotic microbes are recognized as fundamental to all aspects of animal and plant biology, the holobiont and hologenome concepts afford a holistic view of biological complexity that is consistent with the generally reductionist approaches of biology. PMID:26284777

  4. Driving Toward Guiding Principles

    PubMed Central

    Buckovich, Suzy A.; Rippen, Helga E.; Rozen, Michael J.

    1999-01-01

    As health care moves from paper to electronic data collection, providing easier access and dissemination of health information, the development of guiding privacy, confidentiality, and security principles is necessary to help balance the protection of patients' privacy interests against appropriate information access. A comparative review and analysis was done, based on a compilation of privacy, confidentiality, and security principles from many sources. Principles derived from ten identified sources were compared with each of the compiled principles to assess support level, uniformity, and inconsistencies. Of 28 compiled principles, 23 were supported by at least 50 percent of the sources. Technology could address at least 12 of the principles. Notable consistencies among the principles could provide a basis for consensus for further legislative and organizational work. It is imperative that all participants in our health care system work actively toward a viable resolution of this information privacy debate. PMID:10094065

  5. A REVIEW OF THE FUNDAMENTAL PRINCIPLES OF RADIATION PROTECTION WHEN APPLIED TO THE PATIENT IN DIAGNOSTIC RADIOLOGY.

    PubMed

    Moores, B Michael

    2017-06-01

    A review of the role and relevance of the principles of radiation protection of the patient in diagnostic radiology, as specified by the ICRP, has been undertaken, taking into account the diagnostic risks arising from an examination. The increase in population doses arising from diagnostic radiology over the past 20 years has been due to the widespread application of higher-dose CT examinations that provide significantly more clinical information. Consequently, diagnostic risks as well as radiation risks need to be considered within the patient radiation protection framework. Justification and optimisation are discussed, and the limitations imposed on patient protection by employing only a radiation-risk framework are highlighted. The example of radiation protection of the patient in breast screening programmes employing mammography is used to highlight the importance of defined diagnostic outcomes in any effective radiation protection strategy. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. A Risk Assessment Matrix for Public Health Principles: The Case for E-Cigarettes.

    PubMed

    Saitta, Daniela; Chowdhury, Azim; Ferro, Giancarlo Antonio; Nalis, Federico Giuseppe; Polosa, Riccardo

    2017-03-31

    Besides nicotine replacement therapies, a realistic alternative for smoking cessation or for smoking substitution may come from electronic cigarettes (ECs), whose popularity has been steadily growing. As for any emerging behaviour associated with exposure to inhalational agents, there is legitimate cause for concern and many health organizations and policy makers have pushed for restrictive policy measures ranging from complete bans to tight regulations of these products. Nonetheless, it is important to reframe these concerns in context of the well-known harm caused by cigarette smoking. In this article, we discuss key public health principles that should be considered when regulating ECs. These include the concept of tobacco harm reduction, importance of relative risk and risk continuum, renormalization of smoking, availability of low-risk product, proportionate taxation, and reassessment of the role of non-tobacco flavours. These public health principles may be systematically scrutinized using a risk assessment matrix that allows: (1) to determine the measure of certainty that a risk will occur; and (2) to estimate the impact of such a risk on public health. Consequently, the ultimate goal of responsible ECs regulation should be that of maximizing the favourable impact of these reduced-risk products whilst minimizing further any potential risks. Consumer perspectives, sound EC research, continuous post-marketing surveillance and reasonable safety and quality product standards should be at the very heart of future regulatory schemes that will address concerns while minimizing unintended consequences of ill-informed regulation.

  7. A Risk Assessment Matrix for Public Health Principles: The Case for E-Cigarettes

    PubMed Central

    Saitta, Daniela; Chowdhury, Azim; Ferro, Giancarlo Antonio; Nalis, Federico Giuseppe; Polosa, Riccardo

    2017-01-01

    Besides nicotine replacement therapies, a realistic alternative for smoking cessation or for smoking substitution may come from electronic cigarettes (ECs), whose popularity has been steadily growing. As for any emerging behaviour associated with exposure to inhalational agents, there is legitimate cause for concern and many health organizations and policy makers have pushed for restrictive policy measures ranging from complete bans to tight regulations of these products. Nonetheless, it is important to reframe these concerns in context of the well-known harm caused by cigarette smoking. In this article, we discuss key public health principles that should be considered when regulating ECs. These include the concept of tobacco harm reduction, importance of relative risk and risk continuum, renormalization of smoking, availability of low-risk product, proportionate taxation, and reassessment of the role of non-tobacco flavours. These public health principles may be systematically scrutinized using a risk assessment matrix that allows: (1) to determine the measure of certainty that a risk will occur; and (2) to estimate the impact of such a risk on public health. Consequently, the ultimate goal of responsible ECs regulation should be that of maximizing the favourable impact of these reduced-risk products whilst minimizing further any potential risks. Consumer perspectives, sound EC research, continuous post-marketing surveillance and reasonable safety and quality product standards should be at the very heart of future regulatory schemes that will address concerns while minimizing unintended consequences of ill-informed regulation. PMID:28362360

  8. Reinventing Design Principles for Developing Low-Viscosity Carbon Dioxide-Binding Organic Liquids for Flue Gas Clean Up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    2017-01-11

    Anthropogenic carbon dioxide (CO2) emissions from point sources, such as coal-fired power plants, account for the majority of the greenhouse gases in the atmosphere. Capture, storage, and utilization are required to mitigate adverse environmental effects. Aqueous amine-based CO2 capture solvents are currently considered the industry standard, but deployment to market is limited by their high regeneration energy demand. In that context, energy-efficient, less-viscous, water-lean transformational solvent systems known as CO2-Binding Organic Liquids (CO2BOLs) are being developed in our group to advance this technology to commercialization. Herein, we present a logical design approach based on fundamental concepts of organic chemistry and computer simulations aimed at lowering solvent viscosity. Conceptually, viscosity reduction would be achieved by systematic methods such as introduction of steric hindrance on the anion to minimize the intermolecular cation-anion interactions, fine tuning of the electronics, hydrogen bonding orientation and strength, and charge solvation. Conventional trial-and-error approaches, while effective, are time consuming and economically expensive. Herein, we rethink the metrics and design principles of low-viscosity CO2 capture solvents using a combined synthesis and computational modeling approach. We critically study the impacts of modifying factors such as orientation of hydrogen bonding, introduction of higher degrees of freedom, and cation or anion charge solvation, and assess if or how each factor impacts the viscosity of CO2BOL CO2 capture solvents. Ultimately, we found that hydrogen bond orientation and strength predominantly influence the viscosity of CO2BOL solvents. With this knowledge, a new 1-MEIPADM-2-BOL CO2BOL variant was synthesized and tested, resulting in a solvent that is approximately 60% less viscous at 25 mol% CO2 loading with respect to our base compound 1-IPADM-2-BOL. The insights gained

  9. Principles, Techniques, and Applications of Tissue Microfluidics

    NASA Technical Reports Server (NTRS)

    Wade, Lawrence A.; Kartalov, Emil P.; Shibata, Darryl; Taylor, Clive

    2011-01-01

    The principle of tissue microfluidics and its resultant techniques have been applied to cell analysis. Building microfluidics to suit a particular tissue sample would allow the rapid, reliable, inexpensive, highly parallelized, selective extraction of chosen regions of tissue for purposes of further biochemical analysis. Furthermore, the applicability of the techniques ranges beyond the described pathology application. For example, they would also allow the posing and successful answering of new sets of questions in many areas of fundamental research. The proposed integration of microfluidic techniques and tissue slice samples is called "tissue microfluidics" because it molds the microfluidic architectures in accordance with each particular structure of each specific tissue sample. Thus, microfluidics can be built around the tissues, following the tissue structure, or alternatively, the microfluidics can be adapted to the specific geometry of particular tissues. By contrast, the traditional approach is that microfluidic devices are structured in accordance with engineering considerations, while the biological components in applied devices are forced to comply with these engineering presets.

  10. Mastering the Concepts of Geologic Time: Novice Students' Understanding of the Principles of Relative Age

    NASA Astrophysics Data System (ADS)

    Speta, M.; Reid, L.

    2010-12-01

    Misconceptions can adversely affect students’ mastery of the fundamental geoscience concepts necessary for development of the knowledge base required to become a professional geoscientist. In the fall of 2009, in-class learning assessments were introduced into a large (400 student) undergraduate introductory geoscience course to help students develop expert-like problem solving skills for geologic problems. They were also designed to reveal students’ misconceptions on geoscience concepts in order to help direct the course of instruction. These assessments were based on simple, real-world scenarios that geoscientists encounter in their research. One of these assessments focused on the application of concepts of geologic time. It asked students to give the relative ages of granite, schist and shale based on a sketch of two outcrops, and to describe the reasoning behind their answer. In order to test all of the principles of relative age, the assignment had two possible solutions. A post-course analysis of student responses on these assessments was carried out using a modified constant comparative analysis method to identify common misconceptions. This analysis revealed that 61% of students failed to identify both possible solutions. Furthermore, 55% of students applied the principle of superposition to intrusive igneous and metamorphic rocks, and 18% treated the once connected outcrops as having separate geologic histories. 56% of students could not support their proposed geologic history with appropriate reasoning. These results suggest that the principles of relative geologic time that students had the greatest difficulty with were when to apply the principle of superposition and how to apply the principle of original continuity. Students also had difficulty using the principles of relative age to provide appropriate scientific reasoning for their choices.

  11. Metabolic principles of river basin organization.

    PubMed

    Rodriguez-Iturbe, Ignacio; Caylor, Kelly K; Rinaldo, Andrea

    2011-07-19

    The metabolism of a river basin is defined as the set of processes through which the basin maintains its structure and responds to its environment. Green (or biotic) metabolism is measured via transpiration and blue (or abiotic) metabolism through runoff. A principle of equal metabolic rate per unit area throughout the basin structure is developed and tested in a river basin characterized by large heterogeneities in precipitation, vegetation, soil, and geomorphology. This principle is suggested to have profound implications for the spatial organization of river basin hydrologic dynamics, including the minimization of energy expenditure known to control the scale-invariant characteristics of river networks over several orders of magnitude. Empirically derived, remarkably constant rates of average transpiration per unit area through the basin structure lead to a power law for the probability distribution of transpiration from a randomly chosen subbasin. The average runoff per unit area, evaluated for subbasins of a wide range of topological magnitudes, is also shown to be remarkably constant independently of size. A similar result is found for the rainfall after accounting for canopy interception. Allometric scaling of metabolic rates with size, variously addressed in the biological literature and network theory under the label of Kleiber's law, is similarly derived. The empirical evidence suggests that river basin metabolic activity is linked with the spatial organization that takes place around the drainage network and therefore with the mechanisms responsible for the fractal geometry of the network, suggesting a new coevolutionary framework for biological, geomorphological, and hydrologic dynamics.
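
    The step from constant per-area rates to a power law can be made explicit (our paraphrase of the argument, with the exponent quoted only as the value commonly reported for river networks): if transpiration per unit area is a constant r, a subbasin of drainage area A transpires T = rA, so T inherits the power-law distribution of drainage areas,

      P(A \ge a) \propto a^{-\beta} \;\Rightarrow\; P(T \ge t) = P\!\left(A \ge t/r\right) \propto t^{-\beta}, \qquad \beta \approx 0.43.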

  12. Principled Narrative

    ERIC Educational Resources Information Center

    MacBeath, John; Swaffield, Sue; Frost, David

    2009-01-01

    This article provides an overview of the "Carpe Vitam: Leadership for Learning" project, accounting for its provenance and purposes, before focusing on the principles for practice that constitute an important part of the project's legacy. These principles framed the dialogic process that was a dominant feature of the project and are presented,…

  13. A consistent methodology for optimal shape design of graphene sheets to maximize their fundamental frequencies considering topological defects

    NASA Astrophysics Data System (ADS)

    Shi, Jin-Xing; Ohmura, Keiichiro; Shimoda, Masatoshi; Lei, Xiao-Wen

    2018-07-01

    In recent years, shape design of graphene sheets (GSs) by introducing topological defects to enhance their mechanical behaviors has attracted the attention of scholars. In the present work, we propose a consistent methodology for optimal shape design of GSs using a combination of the molecular mechanics (MM) method, the non-parametric shape optimization method, the phase field crystal (PFC) method, Voronoi tessellation, and molecular dynamics (MD) simulation to maximize their fundamental frequencies. At first, we model GSs as continuum frame models using a link between the MM method and continuum mechanics. Then, we carry out optimal shape design of GSs in a fundamental-frequency maximization problem based on a shape optimization method developed for frames. However, the optimal shapes of GSs obtained in this way, consisting only of hexagonal carbon rings, are unstable and do not satisfy the principle of least action, so we relocate carbon atoms on the optimal shapes by introducing topological defects using the PFC method and Voronoi tessellation. At last, we perform structural relaxation through MD simulation to determine the final optimal shapes of GSs. We design two examples of GSs, and the results show that the fundamental frequencies of GSs can be significantly enhanced by the proposed optimal shape design methodology.

  14. Basic principles and ecological consequences of changing water regimes: riparian plant communities.

    PubMed

    Nilsson, Christer; Svedmark, Magnus

    2002-10-01

    Recent research has emphasized the importance of riparian ecosystems as centers of biodiversity and links between terrestrial and aquatic systems. Riparian ecosystems also belong among the environments that are most disturbed by humans and are in need of restoration to maintain biodiversity and ecological integrity. To facilitate the completion of this task, researchers have an important function to communicate their knowledge to policy-makers and managers. This article presents some fundamental qualities of riparian systems, articulated as three basic principles. The basic principles proposed are: (1) The flow regime determines the successional evolution of riparian plant communities and ecological processes. (2) The riparian corridor serves as a pathway for redistribution of organic and inorganic material that influences plant communities along rivers. (3) The riparian system is a transition zone between land and water ecosystems and is disproportionately plant species-rich when compared to surrounding ecosystems. Translating these principles into management directives requires more information about how much water a river needs and when and how, i.e., flow variables described by magnitude, frequency, timing, duration, and rate of change. It also requires information about how various groups of organisms are affected by habitat fragmentation, especially in terms of their dispersal. Finally, it requires information about how effects of hydrologic alterations vary between different types of riparian systems and with the location within the watershed.

  15. Weak Galilean invariance as a selection principle for coarse-grained diffusive models.

    PubMed

    Cairoli, Andrea; Klages, Rainer; Baule, Adrian

    2018-05-29

    How does the mathematical description of a system change in different reference frames? Galilei first addressed this fundamental question by formulating the famous principle of Galilean invariance. It prescribes that the equations of motion of closed systems remain the same in different inertial frames related by Galilean transformations, thus imposing strong constraints on the dynamical rules. However, real world systems are often described by coarse-grained models integrating complex internal and external interactions indistinguishably as friction and stochastic forces. Since Galilean invariance is then violated, there is seemingly no alternative principle to assess a priori the physical consistency of a given stochastic model in different inertial frames. Here, starting from the Kac-Zwanzig Hamiltonian model generating Brownian motion, we show how Galilean invariance is broken during the coarse-graining procedure when deriving stochastic equations. Our analysis leads to a set of rules characterizing systems in different inertial frames that have to be satisfied by general stochastic models, which we call "weak Galilean invariance." Several well-known stochastic processes are invariant in these terms, except the continuous-time random walk for which we derive the correct invariant description. Our results are particularly relevant for the modeling of biological systems, as they provide a theoretical principle to select physically consistent stochastic models before a validation against experimental data.
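
    A textbook illustration of the broken invariance discussed above (not the paper's own derivation): under a Galilean boost the friction term of a Langevin equation changes form, because it singles out the rest frame of the implicit bath,

      x' = x - ut,\quad t' = t, \qquad m\dot{v} = -\gamma v + \xi(t) \;\longrightarrow\; m\dot{v}' = -\gamma\,(v' + u) + \xi(t),

    so the coarse-grained equation is not form-invariant even though the underlying closed (Hamiltonian) dynamics is.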

  16. Minimal Marking: A Success Story

    ERIC Educational Resources Information Center

    McNeilly, Anne

    2014-01-01

    The minimal-marking project conducted in Ryerson's School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The "minimal-marking" concept (Haswell, 1983), which requires…

  17. Neutrons and Fundamental Symmetries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plaster, Bradley

    2016-01-11

    The research supported by this project addressed fundamental open physics questions via experiments with subatomic particles. In particular, neutrons constitute an especially ideal "laboratory" for fundamental physics tests, as their sensitivities to the four known forces of nature permit a broad range of tests of the so-called "Standard Model", our current best physics model for the interactions of subatomic particles. Although the Standard Model has been a triumphant success for physics, it does not provide satisfactory answers to some of the most fundamental open questions in physics, such as: are there additional forces of nature beyond the gravitational, electromagnetic, weak nuclear, and strong nuclear forces?, or why does our universe consist of more matter than anti-matter? This project also contributed significantly to the training of the next generation of scientists, of considerable value to the public. Young scientists, ranging from undergraduate students to graduate students to post-doctoral researchers, made significant contributions to the work carried out under this project.

  18. The Species Delimitation Uncertainty Principle

    PubMed Central

    Adams, Byron J.

    2001-01-01

    If, as Einstein said, "it is the theory which decides what we can observe," then "the species problem" could be solved by simply improving our theoretical definition of what a species is. However, because delimiting species entails predicting the historical fate of evolutionary lineages, species appear to behave according to the Heisenberg Uncertainty Principle, which states that the most philosophically satisfying definitions of species are the least operational, and as species concepts are modified to become more operational they tend to lose their philosophical integrity. Can species be delimited operationally without losing their philosophical rigor? To mitigate the contingent properties of species that tend to make them difficult for us to delimit, I advocate a set of operations that takes into account the prospective nature of delimiting species. Given the fundamental role of species in studies of evolution and biodiversity, I also suggest that species delimitation proceed within the context of explicit hypothesis testing, like other scientific endeavors. The real challenge is not so much the inherent fallibility of predicting the future but rather adequately sampling and interpreting the evidence available to us in the present. PMID:19265874

  19. Microeconomic principles explain an optimal genome size in bacteria.

    PubMed

    Ranea, Juan A G; Grant, Alastair; Thornton, Janet M; Orengo, Christine A

    2005-01-01

    Bacteria can clearly enhance their survival by expanding their genetic repertoire. However, the tight packing of the bacterial genome and the fact that the most evolved species do not necessarily have the biggest genomes suggest there are other evolutionary factors limiting their genome expansion. To clarify these restrictions on size, we studied those protein families contributing most significantly to bacterial-genome complexity. We found that all bacteria apply the same basic and ancestral 'molecular technology' to optimize their reproductive efficiency. The same microeconomics principles that define the optimum size in a factory can also explain the existence of a statistical optimum in bacterial genome size. This optimum is reached when the bacterial genome obtains the maximum metabolic complexity (revenue) for minimal regulatory genes (logistic cost).

  20. The relative entropy is fundamental to adaptive resolution simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kreis, Karsten; Graduate School Materials Science in Mainz, Staudingerweg 9, 55128 Mainz; Potestio, Raffaello, E-mail: potestio@mpip-mainz.mpg.de

    Adaptive resolution techniques are powerful methods for the efficient simulation of soft matter systems in which they simultaneously employ atomistic and coarse-grained (CG) force fields. In such simulations, two regions with different resolutions are coupled with each other via a hybrid transition region, and particles change their description on the fly when crossing this boundary. Here we show that the relative entropy, which provides a fundamental basis for many approaches in systematic coarse-graining, is also an effective instrument for the understanding of adaptive resolution simulation methodologies. We demonstrate that the use of coarse-grained potentials which minimize the relative entropy with respect to the atomistic system can help achieve a smoother transition between the different regions within the adaptive setup. Furthermore, we derive a quantitative relation between the width of the hybrid region and the seamlessness of the coupling. Our results do not only shed light on the what and how of adaptive resolution techniques but will also help setting up such simulations in an optimal manner.
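
    For context, the relative entropy used in systematic coarse-graining is commonly written in Shell's form (quoted here for orientation; the paper's precise conventions may differ):

      S_{\mathrm{rel}} = \sum_{i} p_{\mathrm{AA}}(i)\,\ln\!\frac{p_{\mathrm{AA}}(i)}{p_{\mathrm{CG}}\big(M(i)\big)} \;+\; \big\langle S_{\mathrm{map}} \big\rangle_{\mathrm{AA}} \;\ge\; 0,

    where p_AA and p_CG are the atomistic and coarse-grained configurational probabilities, M is the mapping onto CG coordinates, and S_map accounts for the degeneracy of that mapping; the CG potential minimizing S_rel is the one whose ensemble most closely reproduces the mapped atomistic ensemble.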